Robosense sets new bar for affordable, powerful LiDAR sensors

By Scott Simmie

 

Building or modifying a robot?

Specifically, are you working on something with autonomy that needs to understand an unfamiliar environment? Then you’re likely looking at adding two key sensors: A depth camera and a LiDAR unit.

LiDAR (as most of you likely know) scans the surrounding environment with a continuous barrage of eye-safe laser beams. It measures what’s known as “Time of Flight” – the time it takes for the photons to reflect off surrounding surfaces and return to the LiDAR unit. The closer the surface, the shorter the Time of Flight. LiDARs time each of those reflected beams and convert that into distance. Scatter enough of those beams in a short period of time (and LiDARs do), and you get an accurate digital representation of the surrounding environment – even while the robot is moving through it.
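The arithmetic behind that conversion is simple enough to sketch. Here’s a minimal Python illustration (the numbers are illustrative, not tied to any particular sensor):

```python
# Sketch: converting a LiDAR time-of-flight reading into distance.
# The photon travels out AND back, so the one-way distance is half
# the round-trip time multiplied by the speed of light.

C = 299_792_458.0  # speed of light, metres per second

def tof_to_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in metres."""
    return C * round_trip_seconds / 2.0

# A return after roughly 66.7 nanoseconds means the surface
# is about 10 metres away.
print(tof_to_distance(66.7e-9))
```

A real unit performs this calculation hundreds of thousands of times per second, which is how a dense point cloud emerges from individual timing measurements.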

This is particularly useful for autonomous missions and especially for Simultaneous Localisation and Mapping, or SLAM. That’s where a LiDAR-equipped robot can be placed in a completely unfamiliar (and even GPS-denied) environment and produce a point-cloud map of its surroundings while avoiding obstacles. Quality LiDARs are also capable of producing 3D precision scans for a wide variety of use-cases.

All great, right? Except for one thing: LiDAR sensors tend to be very expensive. So expensive, they can be out of reach for an R&D team, academic institution or startup.

There is, however, a solution: Robosense.

The company produces LiDAR sensors (both mechanical and solid-state) that rival the established players in the market. And they do so for about one-third of the cost of the industry heavyweights.

“The performance of Robosense is outstanding – absolutely on par with its main competitors in North America,” says InDro Account Executive Callum Cameron. “We have been integrating Robosense LiDAR on our products for about two years, and their performance is exceptional.”

Below: A fleet of four robots, equipped with Robosense LiDAR, which recently shipped to an academic client.

 

Robosense LiDAR

ROBOSENSE

 

The company might not yet be a household name (unless your household has robots), but as of May 2024 the firm had sold 460,000 LiDAR units. Its sensors power a large number of autonomous cars, delivery vehicles and other robots – and it’s the first company to achieve mass production of automotive-grade LiDAR units with its own in-house developed chip.

The company was founded in 2014 with some A-level engineering talent – and it’s been on a stellar trajectory ever since. One reason is that Robosense produces all three core technologies behind its products: The actual chipsets, the LiDAR hardware, and the perception software. We’ll let the company itself tell you more:

“In 2016, RoboSense began developing its R Platform mechanical LiDAR. One year later, in 2017, we introduced our perception software alongside the automotive-grade M Platform LiDAR sensors tailored for advanced driver assistance and autonomous driving systems. We achieved the start-of-production (SOP) of the M1 in 2021, becoming the world’s first LiDAR company to mass-produce automotive-grade LiDAR equipped with chips developed in-house,” says its website.

The company now has thousands of engineers. And it didn’t take long before the world noticed what they were producing.

“As of May 17, 2024, RoboSense has secured 71 vehicle model design wins and enabled 22 OEMs and Tier 1 customers to start mass production of 25 models. We serve over 2,500 customers in the robotics and other non-automotive industries and are the global LiDAR market leader in cumulative sales volume.”

The company has also received prestigious recognition for its products, including two CES Innovation awards, the Automotive News PACE award, and the Audi Innovation Lab Champion prize.

“This company has standout features, including Field of View, point cloud density and high frame rates,” says Cameron. “If you look at that fleet of four robots we recently built, the competition’s LiDAR units alone would have come to close to $80,000. The Robosense solution cost roughly one-quarter of that, with similar capabilities.”

And the factories? State of the art. Though this video focuses on its solid-state LiDAR, Robosense uses the same meticulous process for its mechanical units:

LiDAR FOR EVERY APPLICATION

 

Robosense produces many different LiDAR sensors. But what particularly appeals to us is that the company has (excuse the pun) a laser-like focus on the robotics industry. Its Helios multi-beam LiDAR units have been designed from the ground up for robots and intelligent vehicles. There are customisable fields of view, depending on application, and a near-field blind spot of ≤ 0.2 metres. In addition, Helios LiDAR comes in 16- and 32-beam options, depending on point-cloud density and FOV requirements. Both are capable of functioning in temperatures as low as -40°C – or on a scorching day in the Sahara Desert. There’s also protection against multi-radar interference and strong light (which can be an issue with LiDAR). You can learn more about its features here.

Its Bpearl unit proves that very good things can indeed come in small packages. With a 360° horizontal and 90° vertical hemispherical FOV, it’s been designed for near-field blind spots, capable of detection at ≤ 10 cm. That’s why we selected it for a robot designed to inspect cycling lanes for hazards (while avoiding cyclists, of course). We actually have two Bpearls on that robot (one on each side), since covering blind spots and avoiding other obstacles is so critical to this application.

“We’ve integrated both the Bpearl and Helios LiDAR units into multiple different robots and the performance has been excellent, even under adverse conditions,” says Cameron. “Obstacle avoidance has been outstanding, and SLAM missions are a snap.”

Below: This InDro robot features two 32-beam Robosense Bpearl LiDAR units. You can see one of them – that tiny bubble on the side (and there’s another one on the opposite side):

InDro Sentinel

THE THREE “D”s

 

You’ve likely heard this before, but robots are perfect for jobs that are Dirty, Dull or Dangerous – because they remove humans from those scenarios. Robots, particularly inspection robots, are often subjected to extremes in terms of weather and other conditions.

So this is a good place to mention that if a Robosense LiDAR encounters fog, rain, dust or snow it has a de-noising function to ensure it’s still capturing accurate data and that your point cloud isn’t a representation of falling snow. All of the Robosense LiDAR sensors have outstanding Ingress Protection ratings.

Because adverse conditions are quite likely to occur at some point during a robotic mission, Robosense puts its products through absolutely gruelling tests. Hopefully your robot won’t encounter the scenarios seen below, but if it does – the LiDAR will keep working:

INDRO’S TAKE

 

We take pride in putting only the highest quality sensors into our products.

Prior to adopting Robosense as our “go-to” LiDAR about two years ago, we were using big-name products. But those products also came with a big price tag. When we discovered the quality and price of Robosense LiDAR units, the switch was an obvious choice. We have shipped multiple Robosense-enabled robots to clients, saving them thousands of dollars – in one case, tens of thousands – while still capturing every bit of data they require. Robosense now features even on our flagship products. (We recently demonstrated one of our newer Helios-equipped autonomous quadrupeds to a high-profile client; they were amazed with the results.)

“Robosense is every bit the equal of the heavyweight LiDAR manufacturers, without the downside of the high cost,” says InDro Robotics CEO Philip Reece. “The field-of-view, point cloud density and quality of construction are all state-of-the-art, as are the manufacturing facilities. What’s more, Robosense continues to push the envelope with every new product it releases.”

Interested in learning more, including price and options? Contact Account Executive Callum Cameron right here, and he’ll give you all the info you need.

George Mason U. researchers enable robots to intelligently navigate challenging terrain

By Scott Simmie

 

Picture this: You’re out for a drive and in a hurry to reach your destination.

At first, the road is clear and dry. You’ve got great traction and things are going smoothly. But then the road turns to gravel, with twists and turns along the way. You know your vehicle well, and have navigated such terrain before.

And so, instinctively, you slow the vehicle to navigate the more challenging conditions. By doing so, you avoid slipping on the gravel. Your experience with driving, and in detecting the conditions, has saved you from a potential mishap. Yes, you slowed down a bit. But you’ll speed up again when the conditions improve. The same scenario could apply to driving on grass, ice – or even just a hairpin corner on a dry paved road.

For human beings, especially those with years of driving experience, such adjustments are second-nature. We have learned from experience, and we know the limitations of our vehicles. We see and instantly recognize potentially hazardous conditions – and we react.

But what about if you’re a robot? Particularly, a robot that wants to reach a destination at the maximum safe speed?

That’s the crux of fascinating research taking place at George Mason University: Building robots that are taught – and can subsequently teach themselves – how to adapt to changing terrain to ensure stable travel at the maximum safe speed.

It’s very cool research, with really positive implications.

Below: You don’t want this happening on a critical mission…

George Mason Xuesu Xiao Hunter SE

“XX”

 

Those are the initials of Dr. Xuesu Xiao, an Assistant Professor at George Mason University. He holds a PhD in Computer Science, and runs a lab that plays off his initials, called the RobotiXX Lab. Here’s a snippet of the description from his website:

“At RobotiXX lab, researchers (XX-Men) and robots (XX-Bots) perform robotics research at the intersection of motion planning and machine learning with a specific focus on robustly deployable field robotics. Our research goal is to develop highly capable and intelligent mobile robots that are robustly deployable in the real world with minimal human supervision.”

We spoke with Dr. Xiao about this work.

It turns out he’s particularly interested in making robots that are useful to First Responders, carrying out those dull, dirty and dangerous tasks. Speed in such situations can be critical, but comes with its own set of challenges. A robot that makes too sharp a turn at speed on a high-friction surface can easily roll over – effectively becoming useless for its task. Plus, there are the difficulties previously flagged with other terrains.

This area of “motion planning” fascinates Dr. Xiao. Specifically, how to take robots beyond traditional motion planning and enable them to identify and adapt to changing conditions. And that involves machine vision and machine learning.

“Most motion planners used in existing robots are classical methods,” he says. “What we want to do is embed machine learning techniques to make those classical motion planners more intelligent. That means I want the robots to not only plan their own motion, but also learn from their own past experiences.”

In other words, he and his students have been focussing on pushing robots to develop capabilities that surpass the instructions and algorithms a roboticist might traditionally program.

“So they’re not just executing what has been programmed by their designers, right? I want them to improve on their own, utilising all the different sources of information they can get while working in the field.”

 

THE PLATFORM

 

The RobotiXX Lab has chosen the Hunter SE from AgileX as its core platform for this work. The platform was supplied by InDro Robotics and modified with the InDro Commander module. That module provides communication over 5G (and 4G) networks for high-speed data throughput. It comes complete with multiple USB slots and the Robot Operating System (ROS) library onboard, making it easy to add (or remove) multiple sensors and other modifications. It also has a remote dashboard for controlling missions, plotting waypoints, etc.

Dr. Xiao was interested in this platform for a specific reason.

“The main reason is because it’s high speed, with a top speed of 4.8 m per second. For a one-fifth/one-sixth scale vehicle that is a very, very high speed. And we want to study what will happen when you are executing a turn, for example, while driving very quickly.”

As noted previously, people with driving experience instinctively get it. They know how to react.

“Humans have a pretty good grasp on what terrain means,” he says. “Rocky terrain means things will get bumpy, grass can impede motion, and if you’re driving on a high-friction surface you can’t turn sharply at speed. We understand these phenomena. The problem is, robots don’t.”

So how can we teach robots to be more human in their ability to navigate and adjust to such terrains – and to learn from their mistakes?

As you’ll see in the diagram below, it gets *very* technical. But we’ll do our best to explain.

George Mason Hunter Xuesu Xiao

THE APPROACH

 

The basics here are pretty clear, says Dr. Xiao.

“We want to teach the robots to know the consequences of taking some aggressive maneuvers at different speeds on different terrains. If you drive very quickly while the friction between your tires and the ground is high, taking a very sharp turn will actually cause the vehicle to roll over – and there’s no way the robot by itself will be able to recover from it, right? So the whole idea of the paper is trying to enable robots to understand all these consequences; to make them ‘competence aware.'”

The paper Dr. Xiao is referring to has been submitted for scientific publication. It’s pretty meaty, and is intended for engineers/roboticists. It’s authored by Dr. Xiao and researchers Anuj Pokhrel, Mohammad Nazeri, and Aniket Datar. It’s entitled: CAHSOR: Competence-Aware High-Speed Off-Road Ground Navigation in SE(3).

That SE(3) term is used to describe how objects can move and rotate in 3D space. Technically, it stands for Special Euclidean group in three dimensions. It refers to keeping track of an object in 3D space – including position and orientation.
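For readers who want to see it concretely, an SE(3) pose is commonly represented as a 4×4 homogeneous transform combining a 3×3 rotation with a 3D translation. A minimal NumPy sketch (the function name is ours, for illustration – it isn’t from the paper):

```python
# Sketch: an SE(3) pose as a 4x4 homogeneous transform,
# tracking position and orientation together.
import numpy as np

def se3_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation
    matrix and a 3-element translation vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# A 90-degree yaw (rotation about the vertical z axis)
# combined with moving 2 metres forward along x:
yaw_90 = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
pose = se3_pose(yaw_90, np.array([2.0, 0.0, 0.0]))

# Composing poses (matrix multiplication) chains the motion:
# drive-and-turn, then drive-and-turn again.
pose_twice = pose @ pose
print(pose_twice[:3, 3])  # resulting position after two steps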

We’ll get to more of the paper in a minute, but we asked Dr. Xiao to give us some help understanding what the team did to achieve these results. Was it just coding? Or were there some hardware adjustments as well?

Turns out, it was both. Yes, there was plenty of complex coding. There was also the addition of an RTK GPS unit so that the robot’s position in space could be measured as accurately as possible. And because the team soon discovered that intense vibration over rough surfaces could loosen components, threadlock was used to keep things tightly in place.

But, as you might have guessed, machine vision and machine learning are a big part of this whole process. The robot needs to identify the terrain in order to know how to react.

We asked Dr. Xiao if an external data library was used and imported for the project. The answer? “No.”

“There’s no dataset out there that includes all these different basic catastrophic consequences when you’re doing aggressive maneuvers. So all the data we used to train the robot and to train our machine learning algorithms were all collected by ourselves.”

 

SLIPS, SLIDES, ROLLOVERS

 

As part of the training process, the Hunter SE was driven over all manner of demanding terrain.

“We actually bumped it through very large rocks many times and also slid it all over the place,” he says. “We actually rolled the vehicle over entirely many times. This was all very important for us to collect some data so that it learns to not do that in the future, right?”
 
And while the cameras and machine vision were instrumental in determining what terrain was coming up, the role of the robot’s Inertial Measurement Unit was also key.

“It’s actually multi-modal perception, and vision is just part of it. So we are looking at the terrain using camera images and we are also using our IMU. Those inertial measurement unit readings sense the acceleration and the angular velocities of the robot so that it can better respond,” he says.

“Because ultimately it’s not only about the visual appearance of the terrain, it is also about how you drive on it, how you feel it.”
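The idea of combining what the robot sees with what it feels can be sketched very simply. The following is a toy illustration of multi-modal feature fusion – it mirrors the camera-plus-IMU concept Dr. Xiao describes, and is emphatically not the team’s actual TRON model (the function and shapes here are our own illustrative assumptions):

```python
# Sketch: fusing a visual embedding with inertial statistics
# before feeding a downstream terrain/competence predictor.
import numpy as np

def fuse_features(image_embedding: np.ndarray,
                  imu_window: np.ndarray) -> np.ndarray:
    """Combine a visual feature vector with summary statistics of
    a recent IMU window (accelerations + angular velocities).
    Simple per-channel mean/std stand in for a learned encoder."""
    imu_stats = np.concatenate([imu_window.mean(axis=0),
                                imu_window.std(axis=0)])
    return np.concatenate([image_embedding, imu_stats])

# Example: a 128-d image embedding and 50 samples from a
# 6-axis IMU (3 accelerations, 3 angular velocities).
rng = np.random.default_rng(0)
fused = fuse_features(rng.normal(size=128),
                      rng.normal(size=(50, 6)))
print(fused.shape)
```

In a trained system, a vector like this would feed a model that predicts the consequences (slip, rollover risk) of a candidate manoeuvre on the terrain ahead.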

 

THE RESULTS

 

Well, they’re impressive.

The full details are outlined in this paper, but here’s the headline: Regardless of whether the robot was operating autonomously heading to defined waypoints, or whether a human was controlling it, there was a significant reduction in incidents (slips, slides, rollovers etc.) with only a small reduction in overall speed.

Specifically, “CAHSOR (Competence-Aware High-Speed Off-Road Ground Navigation) can efficiently reduce vehicle instability by 62% while only compromising 8.6% average speed with the help of TRON (visual and inertial Terrain Representation for Off-road Navigation).”

That’s a tremendous reduction in instability – meaning the likelihood that these robots will reach their destination without incident is greatly improved. Think of the implications for a First Responder application, where without this system a critical vehicle rushing to a scene carrying medical supplies – or even simply for situational awareness – might roll over and be rendered useless. The slight reduction in speed is a small price to pay for greatly enhancing the odds of an incident-free mission.

“Without using our method, a robot will just blindly go very aggressively over every single terrain – while risking rolling over, bumps and vibrations on rocks, maybe even sliding and rolling off a cliff.”

What’s more, these robots continue to learn with each and every mission. They can also share data with each other, so that the experience of one machine can be shared with many. Dr. Xiao says the learnings from this project, which began in January of 2023, can be applied to marine and even aerial robots as well.

For the moment, though, the emphasis has been fully on the ground. And there can be no question this research has profound and positive implications for First Responders (and others) using robots in mission-critical situations.

Below: The Hunter SE gets put through its paces. (All images courtesy of Dr. Xiao.)

Hunter SE George Mason Xuesu Xiao

INDRO’S TAKE

 

We’re tremendously impressed with the work being carried out by Dr. Xiao and his team at George Mason University. We’re also honoured to have played a small role in supplying the Hunter SE, InDro Commander, as well as occasional support as the project progressed.

“The use of robotics by First Responders is growing rapidly,” says InDro Robotics CEO Philip Reece. “Improving their ability to reach destinations safely on mission-critical deployments is extremely important work – and the data results are truly impressive.

“We are hopeful the work of Dr. Xiao and his team is adopted beyond research and into real-world applications. There’s clearly a need for this solution.”

If your institution or R&D facility is interested in learning more about InDro’s stable of robots (and there are many), please reach out to us here.

InDro builds, delivers custom robot to global client

By Scott Simmie

 

We’ve built a new robot we’d like to tell you about.

It’s for a highly specialised use-case scenario for a global client. (And when we say global client, it’s a household name.)

This isn’t the first project where we’ve been tapped by a heavy-hitting company to design and build custom robots. We have ongoing contracts with others, where unfortunately NDAs prohibit us from disclosing pretty much anything. (We can tell you that one of the ground robots we’re building for one of those clients is pretty big.)

In this case, the client has agreed to let us tell you a fair bit about the product, providing we don’t reveal their name. We think this is a really intriguing robot, so we’re going to share some details – including images of the final product.

Here it is. And, by the way, it’s as tall as the average person. The sensor poking out on the right near the top of the cylindrical portion is positioned at eye-level.

Custom Robot

NOT A PIZZA OVEN

 

With that stretching, stovepipe-like neck, it might look like a pizza oven on wheels. But it’s not. It’s designed that way so that sensors can be roughly at the head height of human beings. The box at the bottom could be thought of as a computer on steroids.

That’s because the client wanted this robot for a very specific purpose: To be able to navigate complex crowds of people.

“The client wants to use Vision SLAM (Simultaneous Localisation and Mapping) to essentially detect humans and pathways through chaotic environments,” says Arron Griffiths, InDro’s Engineering Manager. Arron works out of our Area X.O location, where the robot was fabricated.

“Think malls, shopping centres, and stuff like that where humans are mingling to navigate around. And there’s no really defined path, the robot must organically move around people. Yes, you’d have an overall predetermined path with a desired destination, but once the chaos of humans comes in the robot would safely meander its way through crowds.”

 

LOTS OF TECHNOLOGY

 

That’s not a simple task. The client is going to supply its own autonomy software, but InDro had to work closely with them on the robot’s design and capabilities.

We mentioned earlier that this robot is SLAM-capable. That means it can map its surroundings in real time and make its own decisions – while it’s moving – about where in the ever-changing environment it makes sense to go next. Two ZED depth cameras provide a detailed look at those surroundings (one close to the ground, the other at human eye level), so the robot is constantly scanning, mapping and deciding where to move next in real time.

This is a data-dense task that requires a *lot* of onboard computing power.

“It’s basically a really powerful desktop computer on wheels,” says InDro Account Executive Luke Corbeth. “It’s outfitted with serious computational power, including the same graphics cards that people use to mine bitcoin.”

And that posed another challenge for our engineering team. The client wanted the robot to be able to operate for several hours at a time. But that advanced computing capability really puts a drain on power. 

“Once you stick these high-end computers into a battery-powered robotic system, your run time drops like a stone,” explains Griffiths. “It’s a bit of a beast on power. That’s why we had to put a second battery into the unit. This is an exercise in finding a balance point, and producing a robot that will do a high-end deployment with all of this high-end technology.”

Custom Robot Canada

CLIENT-CENTRED PROCESS

 

This wasn’t the first custom robot that client has requested. The international company has a longer-term research project focussed on enabling a robot to navigate when surrounded by unpredictable human beings. It has developed, and will continue to tweak, its own autonomy software to carry out this task in conjunction with this robot.

InDro worked closely with the client on the design – both the technical requirements in terms of processors, sensors, graphics cards and run time, and the physical appearance. Because the client had some very tight timelines, InDro designed and built this robot in a very short period of time: Seven weeks from outset until the product was shipped.

“That’s extremely fast,” says Griffiths. “That’s the fastest custom robot I’ve seen in my working profession. You’ve got to think design cycles, manufacturing, outsourcing, testing. From this being nothing, to being shipped out in less than two months is incredible.”

 

SOLUTION-FOCUSED

 

But there’s a difference between carrying out an expedited task – and doing a rush job. The focus always had to remain on ensuring that the capabilities, design, build and testing of this machine would meet or exceed the client’s rigorous standards. And that meant even the tiniest details counted.

For example, we’d discovered with a previous robot using the same locomotion platform that there could be an issue on rough surfaces. Specifically, if you were turning a tight corner or accelerating while turning, the wheels could shudder and jump. This was especially an issue on asphalt and concrete.

InDro’s engineering team knew that with this robot any such shudders would be amplified due to the height of the machine; a minor shudder at the base would translate into significant wobbling at the robot’s top. That wasn’t something we wanted happening.

And so we created a solution. We covered the individual wheels with a 3D-printed wrap. This provides a barrier between the sticky rubber and the ground, allowing the robot to slide slightly during such manoeuvres and avoiding those troubling vibrations.

 

Below: Detail of the wheels, with their new coating

Custom Robot

CLIENT REACTION

 

When we pack up and ship a custom build, the client always gets in touch after they’ve received the product. That’s the moment of truth – and the feedback we eagerly await.

Not long after the robot arrived, an email from the client landed. It included the following:

“The robot is fantastic,” they wrote. “The craftsmanship is superb; the power on the base is enabling; the intricate way in which the computer fits in the base housing is incredible; the compute box + mast feels ‘just right’ (there’s no template for social robot design, but I feel like we got very close).

“All these things make me really confident that, with the right algorithms (my responsibility) we can safely and efficiently navigate through crowds. It’s a really special robot that I can’t wait to put in the field.  Your team deserves a raise!”

This robot, though it can’t cook pizzas, is one of the most powerful Uncrewed Ground Vehicles InDro has built, at least in terms of raw onboard computational power. Engineering lead Griffiths believes its capabilities could make a variation of this machine suitable for other clients, as well.

“I think it’s a very good platform for clients who want very high computing power in a small form factor that actually has some range, some longevity to it,” he says.

Below: Even when they’re under the gun, our engineering team takes it all in stride

Robotics Engineers

INDRO’S TAKE

 

We’re often working on projects like this. In fact, this isn’t the first major global client to tap InDro for custom builds. As our tagline states: “Invent. Enhance. Deploy.” That’s what we do.

“This was an expedited design, build and test of a completely new and computationally powerful robot,” says InDro Robotics CEO Philip Reece. “We know that InDro’s reputation rides on every product we ship and every service we provide. So we’re delighted to hear the client is as pleased with this robot as we are – and look forward to building more for them.”

Interested in what a powerhouse machine like this might do for you? Feel free to explore the possibilities by setting up a conversation with Account Executive Luke Corbeth.

 

CBC HIGHLIGHTS YOW DRONE DETECTION SYSTEM

By Scott Simmie

 

If you follow InDro Robotics, you’ll likely be aware that we were a co-founder and core technology partner of the YOW Drone Detection Pilot Project.

The system has been operating since the fall of 2020, and detects drone intrusions not only at the Ottawa International Airport, but as far as 40 kilometres away in the National Capital Region. Data from the project helps to inform airport protocols and is shared on a regular basis with Transport Canada and law enforcement.

Back during the “Freedom Convoy” protests in downtown Ottawa, the system got onto the mainstream radar after we published this story, which outlined the high number of unauthorised drone flights taking place in downtown Ottawa. The Ottawa Citizen covered that story here and it was also a cover story for WINGS Magazine.

Now, the system is back in the news for a different reason: The recent visit of US President Joe Biden to Ottawa.

President Biden

AIR FORCE ONE

 

Prior to the actual visit, advance teams from the Secret Service and Air Force One wanted to check out security and logistics at the Ottawa International Airport. And one of the first questions? Whether YOW had a drone detection system.

The answer, as you know, is Yes. We interviewed Michael Beaudette, YOW’s VP of Security, Emergency Management and Customer Transportation. The resulting post garnered a lot of attention, including a lengthy interview by CBC Ottawa.

To view the segment on the Drone Detection Pilot project, check out the video below.

Ottawa International Airport, InDro, provide drone detection during Biden visit

By Scott Simmie

 

A drone detection system described as “probably the best at any airport in the country” played a role in ensuring the safety of Air Force One during Joe Biden’s first visit as US President to Canada.

InDro Robotics is one of the key technology partners, supplying drone detection hardware and software for the Ottawa International Airport (YOW) Drone Detection Pilot Project. It detects drone flights both near YOW and much further afield.

In advance of President Biden’s visit, the US Secret Service, as well as an advance team from Air Force One, visited YOW as part of preparations.

The teams wanted to be briefed on airport security, including security measures for the skies and the airport grounds. That included learning about the capabilities of YOW’s Drone Detection Pilot Project – which has been accurately detecting drones at the airport and beyond for years. The program has gained significant media attention – including a cover story for WINGS magazine in 2022:

Drone detection

DRONE DETECTION

 

When the Secret Service and those involved with Air Force One visited YOW on an advance reconnaissance trip, one of the first questions asked was about drones.

“They asked do we have a drone detection capability – and we were quite proud to tell them that we have probably the best at any airport in the country,” says Michael Beaudette, VP of Security, Emergency Management and Customer Transportation at YOW.

“It provides us with situational awareness not only of the immediate area, but throughout the National Capital Region up to almost 40 kilometres.”

Certain areas of Ottawa’s downtown core are designated restricted airspace because of the House of Commons, embassies and other sensitive locations.

“During his (Biden’s) visit we paid particular attention to anything flying near the Ottawa airport or downtown,” says Beaudette. “Law enforcement are aware of the capabilities we have. It’s a good partnership and we were happy to be able to give something back to the police and intelligence services.”

 

THE INDRO CONNECTION

InDro provides core technology for the drone detection system. Other technology partners include Accipiter Radar, Aerial Armor and Skycope – a Canadian firm whose tech includes a database of unique RF signatures emitted by multiple brands of drones. NAV Canada is part of the project, and Transport Canada is kept in the loop on the data generated by the operation.

The effectiveness of the system was proven during the massive convoy protest in downtown Ottawa early in 2022. It detected multiple flights of drones in restricted airspace where UAVs are not permitted to fly. Those detections were covered by the Ottawa Citizen.

Below: Some of the data captured during the 2022 convoy protest in Ottawa, documenting a wealth of illegal drone flights:


ONGOING DETECTION

 

The system runs 24/7, and is capable of triggering an alert whenever a drone intrusion is detected. In mid-March, 2023, a week prior to the US President’s visit, the system indicated an attempted drone intrusion on airport property.

“The alarms went off and they were tracking it – but because of the geofence around the airport, the pilot couldn’t get control of the drone and put it back down again and departed airport property,” says Beaudette.
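Beaudette’s account touches on manufacturer geofencing: a consumer drone continuously checks its GPS position against no-fly zones around airports and refuses to enter them. As a minimal sketch of that idea only – the 5 km radius is invented, the coordinates are approximate, and real geofences use manufacturer-defined polygons rather than a circle – a no-fly check could look like:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative values only: YOW's approximate coordinates, arbitrary radius.
YOW = (45.3225, -75.6692)
NO_FLY_RADIUS_KM = 5.0

def inside_geofence(lat, lon):
    """True if the point falls within the no-fly circle around the airport."""
    return haversine_km(lat, lon, *YOW) <= NO_FLY_RADIUS_KM
```

A drone whose firmware gets True from a check like this would refuse control inputs that take it further in, which matches the behaviour Beaudette describes.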

“That one’s in our investigations right now. Anything that happens a week out from the visit we look at it very closely. Is it someone doing a rehearsal to detect weak points? Is it a plane enthusiast having a look? Someone who bought a new drone at Costco and decided to try it out? While it’s a little more challenging because the individual left, we did get a license plate and we’re now just connecting the dots.”

 

A SIGNIFICANT INTRUSION

 

That’s not the only recent intrusion. In December of 2022, there was a much more serious incident.

A pilot flew a drone in the immediate vicinity of YOW’s runways, within a couple of hundred feet of landing aircraft. It was also a larger drone, which would almost certainly have caused damage had it collided with a crewed aircraft.

The system not only detected the drone, but pinpointed the location of the pilot. Law enforcement was immediately dispatched, and the pilot was caught in the act.

Wary of recent global incidents, authorities at YOW kept a very close watch during the US Presidential visit:

“We’ve seen a lot of incidents where drones can pose a significant threat, and certainly the war in Ukraine has advanced the offensive nature of drone use considerably,” says Beaudette. “There’s also recent footage of drone infiltration into Russian military installations where they were able to land a drone on top of an aircraft undetected. So you really have to have the capability to detect and respond to those threats.”

While the system does not have mitigation capabilities at this point (jamming RF frequencies is very complex under Canadian regulations except in extraordinary circumstances), the system is highly capable of real-time drone detection and identification, as well as pinpointing the position of the operator.

Below: Michael Beaudette, VP of Security, Emergency Management and Customer Transportation at the Ottawa International Airport, during an interview with Scott Simmie

 


EYES ON THE SKY

The system did detect some drone activity in the National Capital Region during President Biden’s visit, but nothing that was deemed to pose a threat.

Below: President Biden meets with Canadian Prime Minister Justin Trudeau March 23, 2023. Image via Prime Minister Trudeau’s Twitter account:


MISSION ACCOMPLISHED

President Biden, along with his aides and a media contingent, departed YOW the evening of March 24.

No drone flights were detected at the Ottawa International Airport during the visit.

“There’s been nothing that looks like it’s targeting the airport or wanting to get a look at Air Force One,” said Beaudette at the close of the Presidential trip.

Below: File photo of US President Joe Biden boarding Air Force One.


INDRO’S TAKE

InDro was, naturally, pleased there were no drone incursions at YOW during the visit by the US President. But it’s reassuring to know a system was in place that could have detected any drone flights during this important visit.

“The Drone Detection Pilot Project has proven its worth since its inception,” says InDro Robotics CEO Philip Reece. “Getting no detections and being assured there’s no potential threat is just as valuable as identifying incursions – especially during a high-security event such as this.

“We’re proud of this ongoing project and our partnership with YOW, NAV Canada, and our technology partners Accipiter, Aerial Armor and Skycope. We believe this has proven to be an effective model, and one that could be deployed with confidence at other major airports or sensitive facilities.”

The YOW drone detection system generates monthly reports, and we’ll be sure to update you when news warrants. Speaking of which, we’ve issued a news release on this as well. You can download it here.

CONTACT

INDRO ROBOTICS
305, 31 Bastion Square,
Victoria, BC, V8W 1J1

P: 1-844-GOINDRO
(1-844-464-6376)

E: Info@InDroRobotics.com

copyright 2022 © InDro Robotics all rights reserved


InDro Robotics provides live drone video feed at Montreal Marathon in pilot medical project

By Scott Simmie

 

The Montreal Marathon, 2022 edition, was held over the weekend. The main event, the signature 42-kilometre run, took place early Sunday. And three InDro Robotics engineers were there.

They weren’t running; they were providing live video feeds from drones. Dedicated research assistants watched those feeds on large screens, assessing their quality and their usefulness in detecting runners who might need medical assistance.

Below: Team InDro, wearing safety vests, with Montreal Marathon runners on the right


Research project

 

InDro became involved with this through Dr. Valérie Homier, an Emergency Physician at McGill University Health Centre. She has long had an interest in how drones can be used in the health care sector, and has collaborated with InDro on two previous research projects.

One of those projects compared how efficiently drones and ground transport could deliver simulated blood products to a trauma facility – the Montreal General Hospital. Drones were faster.

The second project studied whether drones could help identify swimmers in distress at an IRONMAN event in Mont-Tremblant. You can find that research here.

With the Montreal Marathon coming up, Dr. Homier knew there would likely be medical events. There generally are.

“In these long-distance sporting events there are usually some significant injuries, including cardiac events and heat strokes,” she says.

These tend to be more likely in the later phases of events like marathons, after the athlete has already been under stress for an extended time. The thinking was that perhaps drones could be a useful tool.

Dr. Homier was particularly interested in whether two drones in the air, covering two critical segments toward the end of the marathon, could provide useful data. Specifically, would the live video feed be consistent enough in quality and resolution to be a useful tool?

This pilot aimed to find out.

Below: An uphill segment near the Montreal Marathon finish line. This was the target area for one of the InDro Robotics drones

 


InDro’s role

 

There was a lot of planning required for the mission to ensure the drones could provide continuous coverage and be safe for flying in an area with so many people. Project Manager Irina Saczuk (who happens to also be an RN) worked closely with Dr. Homier to help figure out the nuts and bolts of the InDro side of things.

InDro assigned three employees from the Area X.O facility to the project: Software developers Ella Hayashi and Kaiwen Xu, along with mechatronics specialist Liam Dwyer. All three hold Advanced RPAS certificates and took part in planning meetings to understand the mission and their roles. They also looked into optimising the drones’ video feeds to ensure the best quality would reach those monitoring remotely on large screens.

“At big-scale events such as this marathon, lots of people could go down with injuries,” says InDro’s Ella Hayashi. “But it can be hard to get timely support because roads are blocked. So drones have the potential to really help with sharing the precise location and other information when a person may need help.”

Worth noting here: The InDro engineers/pilots were not actively ‘looking’ for people in medical distress. Their role was simply to pilot the drones at the assigned locations and maintain a video feed that gave those watching the large-screen monitors good situational awareness. In the event of an emergency, the pilots were to follow instructions, including moving in closer to a runner in distress.

 

Sub-250 grams

 

The team took four DJI Mini 2 drones to Montreal. Though InDro has built a fleet of much larger and more sophisticated drones, these consumer models were perfect for the job. That’s because the Mini 2 is a sub-250 gram drone that can be flown near and over people. In the exceedingly rare event of a failure, such a small device is unlikely to cause substantial injury to someone on the ground. The Mini 2 also captures very good video.

The team also used a third-party app – Airdata – to carry the video streams. The app created secure links for each drone’s feed that could be shared with those monitoring remotely. Three drones were flown in rotation so that two were always in the air providing live video at any given time; a fourth was on site as a backup.

“We modified the parameters and were streaming in 720p,” explains Dwyer. “We selected a lower resolution because on the bigger screen it didn’t have to be crystal clear but it needed to be smooth.”
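Dwyer’s 720p choice trades resolution for a smoother stream on a shared LTE uplink. As a rough back-of-envelope estimate only – the 0.1 bits-per-pixel figure is a commonly cited H.264 rule of thumb, not a measured value, and `estimate_bitrate_mbps` is an illustrative helper – the bandwidth difference can be sketched:

```python
def estimate_bitrate_mbps(width, height, fps, bits_per_pixel=0.1):
    """Rough H.264 stream bitrate estimate from a bits-per-pixel heuristic."""
    return width * height * fps * bits_per_pixel / 1e6

b720 = estimate_bitrate_mbps(1280, 720, 30)    # roughly 2.8 Mbps
b1080 = estimate_bitrate_mbps(1920, 1080, 30)  # roughly 6.2 Mbps
```

Under these assumptions, two simultaneous 720p feeds need on the order of 5–6 Mbps of sustained uplink versus more than twice that at 1080p, which helps explain both the resolution choice and the team’s concern about a congested cell.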

There was, initially, some concern over whether the local LTE network would be able to handle the feed due to the large number of people using cellphones to capture and stream from the finish line.

“The night before the mission, a medical person told us there were going to be 20,000 people around the stadium,” says Xu. “We were worried about network connectivity, it was possible that our video streaming would not work. But actually the network was pretty good that day.”

Below is a drone selfie of the InDro team: From left to right, Kaiwen Xu, Ella Hayashi, Liam Dwyer

 


A useful exercise

 

Remember: This was simply a pilot project to determine whether drones could provide a clean video stream that might be useful. The pilots were to focus on hovering the drones at two adjacent locations, with some overlap in their video to ensure no spot along this critical stretch of the marathon was missed.

“Our job was 100 per cent flying the drones,” says Dwyer. “Just straightforward, wide-angle shots with all runners in the field of view.”

We should mention here that InDro also took part in a simulated cardiac event prior to the marathon reaching this area. A medical dummy was placed in a location and one of the drone pilots was instructed to get closer for a better look. A small electric vehicle – think a large golf cart adapted for First Responder use – was dispatched. Chest compressions were performed on the dummy, which was then loaded into the vehicle. A drone followed as the vehicle drove to a nearby stadium and the victim was transported inside to the treatment area. The feed gave others on the Medi-Drone team an opportunity to see, in real-time, the progress of the patient’s arrival.

“The drone response really gave them an active timeline of when they should expect to receive this patient,” says Dwyer.

So the drones proved useful during a simulation. But how would they perform with runners during the actual marathon?

Below: The downhill segment monitored by InDro Robotics


From simulation to real-world

 

As the lead runners came in, the field wasn’t crowded. But, of course, it would become more congested.

When athletes are moving together en masse like this, Dr. Homier says, there’s a certain flow that can be observed from the drone. Because that flow is consistent and smooth, a runner in distress immediately stands out as looking out of place.

And it happened. Those watching the live feed spotted someone who appeared to be in distress. The runner had stopped and was hanging on to a railing at the side of the course. Then they fell over the railing, dropping to the grass. A drone pilot was asked to move in for a closer look. It was clear this runner needed help.

In fact, while the pilots’ primary job was simply to hover their drones, Dr. Homier had anticipated such a scenario and built it into the protocol for the pilot project. Suddenly, an InDro pilot had become part of a First Responder team, providing much-needed situational awareness.

“It was embedded in the research protocol, that eyes on the event becomes what is required,” she explains. “It was called into dispatch and pilots were able to provide eyes on the incident. That was amazing; dispatch came down after and brought us a radio.”

 

Lessons learned

 

For Dr. Homier, there’s still work ahead and a lot of data to be analyzed.

“There’s a lot to learn from this project, and there’s a way forward for multiple surveillance methods,” she says.  “And the drones are way up there. The view from above when monitoring moving crowds is just incomparable.”

Plus, says Dr. Homier, the project sparked a tremendous amount of interest from other healthcare professionals on site.

“The interest was incredible, coming from the drone pilots, the students, the medical directors, the medical staff – they all thought it was so cool,” she says.

“We’re talking about 250 people involved in the medical team. Many came to see the viewing station, so in terms of letting people know about this new use of the technology – that was also a great success.”

Below: Mission accomplished! Team InDro is joined by key members of the marathon’s medical response team for this post-race drone selfie


InDro’s take

 

We’re proud to be involved with this project – just as we’re proud to have collaborated previously with Dr. Valérie Homier on other research projects involving drones. In fact, we find this kind of research particularly meaningful.

“For us, using drones for good is much more than a catchy hashtag,” says InDro Robotics CEO Philip Reece. “Aerial and ground robots can perform so many useful tasks. We’ve helped securely deliver prescriptions to remote locations, COVID test supplies, and more. But playing a role in helping to ensure that someone in medical distress receives timely assistance is up near the top of the list. We look forward to the next project with Dr. Homier.”

And nice job, Ella, Kaiwen and Liam.

PS: We’ve issued a news release about this project. You can read it here.