Still a long road to fully autonomous passenger cars


By Scott Simmie

 

We hear a lot about self-driving cars – and that’s understandable.

There are a growing number of Teslas on the roads, with many owners testing the latest beta versions of FSD (Full Self-Driving) software. The latest release allows for automated driving on both highways and in cities – but the driver still must be ready to intervene and take control at all times. Genesis, Hyundai and Kia electric vehicles can actively steer, brake and accelerate on highways while the driver’s hands remain on the wheel. Ford EVs offer something known as BlueCruise, a hands-free mode that can be engaged on specific, approved highways in Canada and the US. Other manufacturers, such as Honda, BMW and Mercedes, are also in the driver-assist game.

So a growing number of manufacturers offer something that’s on the path to autonomy. But are there truly autonomous vehicles intended to transport humans on our roads? If not, how long will it take until there are?

Good question. And it was one of several explored during a panel on autonomy (and associated myths) at the fifth annual CAV Canada conference, which took place in Ottawa on December 5. InDro’s own Head of Robotic Solutions (and Tesla owner) Peter King joined other experts in the field on the panel.

 


Levels of autonomy

 

As the panel got underway, there were plenty of acronyms being thrown around. The most common were L2 and L3, standing for Level 2 and Level 3 on a scale of autonomy that ranges from zero to five.

This scale was created by the Society of Automotive Engineers as a reference classification system for motor vehicles. At Level 0, there is no automation whatsoever, and all aspects of driving require human input. Think of your standard car, where you basically have to do everything. Level 0 cars can have some assistive features such as stability control, collision warning and automatic emergency braking. But because none of those features are considered to actually help drive the car, such vehicles remain in Level 0.

Level 5 is a fully autonomous vehicle capable of driving at any time of the day or night and in any conditions, ranging from a sunny day with dry pavement through to a raging blizzard or even a hurricane (when, arguably, no one should be driving anyway). The driver does not need to do anything other than input a destination, and is free to watch a movie or even sleep during the trip. In fact, a Level 5 vehicle would not need a steering wheel, gas pedal, or other standard manual controls. It would also be capable of responding to emergency situations entirely on its own.
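For readers who like things spelled out, here’s the same scale in shorthand – a minimal sketch that paraphrases the SAE J3016 definitions (the one-line summaries are our own paraphrase, not SAE’s official wording):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Paraphrased summary of the SAE J3016 levels of driving automation."""
    L0_NO_AUTOMATION = 0   # human does all the driving; warnings/AEB don't count
    L1_DRIVER_ASSIST = 1   # steering OR speed support (e.g. adaptive cruise)
    L2_PARTIAL = 2         # steering AND speed support; driver must supervise
    L3_CONDITIONAL = 3     # system drives in limited conditions; driver on standby
    L4_HIGH = 4            # system drives itself within a defined operating domain
    L5_FULL = 5            # system drives itself anywhere, in any conditions

def driver_must_supervise(level: SAELevel) -> bool:
    """At Level 2 and below, a human must actively monitor the road at all times."""
    return level <= SAELevel.L2_PARTIAL
```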

Currently, the vast majority of cars on the road in North America are Level 0. And even the most advanced Tesla would be considered Level 2. There is a Level 3 vehicle on the roads in Japan, but currently (at least to the best of our knowledge and research) there are no Level 3 vehicles in the US or Canada.

As consumer research and data analytics firm J.D. Power describes it:

“It is worth repeating and emphasizing the following: As of May 2021, no vehicles sold in the U.S. market have a Level 3, Level 4, or Level 5 automated driving system. All of them require an alert driver sitting in the driver’s seat, ready to take control at any time. If you believe otherwise, you are mistaken, and it could cost you your life, the life of someone you love, or the life of an innocent bystander.”

To get a better picture of these various levels of autonomy, take a look at this graphic produced by the Society of Automotive Engineers International.


Now we’ve got some context…

 

So let’s hear what the experts have to say.

The consensus, as you might have guessed, is that we’re nowhere near the elusive goal of a Level 5 passenger vehicle.

“Ten years ago, we were all promised we’d be in autonomous vehicles by now,” said panel moderator Michelle Gee, a Business Development and Strategy Director with extensive experience in the automotive and aerospace sectors. Gee then asked panelists for their own predictions as to when Level 4 or 5 vehicles would truly arrive.

“I think we’re still probably about seven-plus years away,” offered Colin Singh Dhillon, CTO with the Automotive Parts Manufacturers’ Association.

“But I’d also like to say, it’s not just about the form of mobility, you have to make sure your infrastructure is also smart as well. So if we’re all in a bit of a rush to get there, then I think we also have to make sure we’re taking infrastructure along with us.”


It’s an important point.

Vehicles on the path to autonomy currently have to operate within an infrastructure originally built for human beings operating Level 0 vehicles. As they move up through progressive levels of autonomy, such vehicles must be able to scan and interpret signage and traffic lights, assess weather and traction conditions – and much more.

Embedding smart technologies along urban streets and even on highways could take some of that burden off the vehicle and streamline data processing in future. If a Level 4 or 5 vehicle ‘knew’ there was no traffic coming at an upcoming intersection, there would be no need to stop. In fact, if all vehicles were Level 4 or above, smart infrastructure could eliminate the need for traffic lights and road signs entirely.
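To make that concrete, here’s a hypothetical sketch of how a connected vehicle might use an intersection broadcast to decide whether it needs to stop. The message fields and names are our own illustration, not any deployed V2I standard:

```python
from dataclasses import dataclass

@dataclass
class IntersectionStatus:
    """Hypothetical status broadcast from a smart intersection."""
    intersection_id: str
    cross_traffic_present: bool
    pedestrians_present: bool

def vehicle_must_stop(status: IntersectionStatus) -> bool:
    # With trustworthy infrastructure data, a Level 4/5 vehicle could roll
    # through an empty intersection instead of stopping for a light or sign.
    return status.cross_traffic_present or status.pedestrians_present
```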

 

Seven to 10 years?

 

If that’s truly the reality, why is there so much talk about autonomous cars right now?

The answer, it was suggested, is in commonly used – but misleading – language. The term “self-driving” has become commonplace, even when referring solely to the ability of a vehicle to maintain speed and lane position on the highway. Tesla refers to its beta software as “Full Self-Driving.” And when consumers hear that, they think autonomy – even though such vehicles are only Level 2 on the autonomy scale. So some education around language may be in order, suggested some panelists.

“It’s the visual of the word ‘self-driving’ – which somehow means: ‘Oh, it’s autonomous.’ But it isn’t,” explained Dhillon. “…maybe make automakers change those terms. If that was ‘driver-assisted driving,’ then I don’t think people would be sleeping at the wheel whilst they’re on the highway.”

One panelist suggested looking ahead to Level 5 may be impractical – and even unnecessary. Recall that Level 5 means a vehicle capable of operating in all conditions, including weather events like hurricanes, where the vast majority of people would not even attempt to drive.

“It’s not safe for a human to be out in those conditions…I think we should be honing down on the ‘must-haves,’” offered Selika Josiah Talbott, a strategic advisor known for her thoughtful takes on autonomy, EVs and mobility.

“Can it move safely within communities in the most generalised conditions? And I think we’re clearly getting there. I don’t even know that it’s (Level 5) something we need to get to, so I’d rather concentrate on Level 3 and Level 4 at this point.”

 


InDro’s Peter King agrees that Level 5 isn’t coming anytime soon.

“I believe the technology will be ready within the next 10 years,” he says. “But I believe it’ll take 30-40 years before we see widespread adoption due to necessary changes required in infrastructure, regulation and consumer buy-in.”

And that’s not all.

“A go-to-market strategy for Level 5 autonomy is a monumental task. It involves significant investments in technology and infrastructure – and needs to be done so in collaboration with regulators while also factoring in safety and trust from consumers with a business model that is attainable for the masses.”

What about robots?

Specifically, what about Uncrewed Ground Vehicles like InDro’s Sentinel inspection robot, designed for monitoring remote facilities like electrical substations and solar farms? Sentinel is currently teleoperated over 4G and 5G networks with a human controlling the robot’s actions and monitoring its data output. 

Yet regular readers will also know we recently announced InDro Autonomy, a forthcoming software package we said will allow Sentinel and other ROS2 (Robot Operating System) machines to carry out autonomous missions.

Were we – perhaps like some automakers – overstating things?

“The six levels of autonomy put together by the SAE are meant to apply to motor vehicles that carry humans,” explains Arron Griffiths, InDro’s lead engineer. In fact, there’s a separate categorization for UGVs.

The American Society for Testing and Materials (ASTM), which creates standards, describes those tiers as follows: “Currently, an A-UGV can be at one of three autonomy levels: automatic, automated, or autonomous. Vehicles operating on the first two levels (automatic and automated) are referred to as automatic guided vehicles (AGVs), while those on the third are called mobile robots.”

“With uncrewed robots like Sentinel, we like to think of autonomy as requiring minimal human intervention over time,” explains Griffiths. “Because Sentinel can auto-dock for wireless recharging in between missions, we believe it could go for weeks – quite likely even longer – without human intervention, regardless of whether that intervention is in-person or virtual,” he says.

“The other thing to consider is that these remote ground robots, in general, don’t have to cope with the myriad of inputs and potential dangers that an autonomous vehicle driving in a city must contend with. Nearly all of our UGV ground deployments are in remote and fenced-in facilities – with no people or other vehicles around.”
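A rough sketch of what that low-intervention loop could look like is below. The scheduling values and method names are purely illustrative – a sketch of the idea, not InDro Autonomy’s actual API:

```python
import time

MISSION_INTERVAL_S = 6 * 60 * 60   # hypothetical: inspect the site every six hours
MIN_BATTERY_PCT = 30               # hypothetical: don't start a run below this charge

def patrol_forever(robot):
    """Repeat inspections indefinitely, recharging on the dock in between."""
    while True:
        if robot.battery_percent() < MIN_BATTERY_PCT:
            robot.return_to_dock()          # auto-dock for wireless recharging
            robot.wait_until_charged()
        robot.run_inspection_route()        # follow the pre-recorded route
        robot.upload_georeferenced_data()   # reviewed remotely, whenever convenient
        robot.return_to_dock()
        time.sleep(MISSION_INTERVAL_S)
```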

So yes, given that InDro’s Sentinel will be able to operate independently – or with minimal human intervention spread over long periods – we are comfortable with saying that machine will soon be autonomous. It will even be smart enough to figure out shortcuts over time that might make its data missions more efficient.

It won’t have the capabilities of that elusive Level 5 – but it will get the job done.

InDro’s take

 

Autonomy isn’t easy. Trying to create a fully autonomous vehicle that can safely transport a human (and drive more safely than a human in all conditions) is a daunting task. We expect Level 5 passenger vehicles will come, but there’s still a long road ahead.

Things are easier when it comes to Uncrewed Ground Vehicles collecting data in remote locations (which is, arguably, where they’re needed most). They don’t have to deal with urban infrastructure, unpredictable drivers, reading and interpreting signage, etc.

That doesn’t mean it’s easy, of course – but it is doable.

And we’re doing it. Stay tuned for the Q1 release of InDro Autonomy.

 

Coming soon: InDro Autonomy software


By Scott Simmie

We’ve got news.

InDro Robotics is in the final stages of developing a new product that will vastly improve the capabilities of ground robots running the ROS2 or ROS1 Robot Operating System libraries. In short, it will transform a remotely tele-operated machine into an autonomous one. The software – no surprise here – is called InDro Autonomy.

“It’s our solution to automating the mundane tasks of outdoor navigation,” says Arron Griffiths, lead engineer at InDro’s Area X.O R&D facility in Ottawa, Ontario.

As you know, ground and aerial robots excel at jobs often referred to as “The Three Ds” – meaning tasks that are dirty, dull or dangerous (and sometimes all three). We would actually add a fourth “D” to that set and include jobs that are distant.

Think, for example, of remote locations such as electrical substations or solar farms. Currently, most robots (including many InDro products in the field) are remotely tele-operated by someone connecting with those robots over 4G or 5G networks. That’s an awesome capability, and one being put to use regularly by InDro clients. (In fact, we recently did a demonstration for T-Mobile at an Analyst’s Summit in Washington State. Attendees could operate a robot using an Xbox controller in real-time. The robot was in Ottawa – some 4,000 kilometres away.)

So tele-operation will still be with us, and is exceedingly effective for many use-cases. But there are some tasks where autonomy would be preferable.

Below: InDro’s Austin Greisman tweaks a Sentinel robot at our Area X.O facility


Why autonomy?

 

While tele-operations are appropriate for a variety of inspection and surveillance tasks, there are cases where autonomy is a preferable option: specifically, when a robot is repeatedly carrying out the same route or mission. This is especially true as routes get longer, such as inspections at sprawling solar farms. Why assign a human being to control a robot for a lengthy and repetitive task when a robot can carry it out on its own? That’s where InDro Autonomy comes in.

“Solar farms can be absolutely massive and very hard to maintain,” explains Griffiths. “Having a human walk or even drive through with a thermal camera is very inefficient. Our Sentinel ground robot, for example, can do this on its own.”

InDro’s Austin Greisman, who has been integral to InDro Autonomy’s development, puts it like this:

“Performing routine inspection checks – going to see the same thing over and over again – can be automated with this type of system,” he says. “So you just collect the data.”

 

Compatibility

 

InDro Autonomy was inspired by our exclusive Commander module. It’s a bolt-on solution for ground robots that allows for rapid customisation and sensor integration. It does so by containing the complete Robot Operating System (ROS2) software library onboard, as well as an Edge computer for real-time processing. With its 4G and 5G tele-operating capabilities, open USB slots, built-in camera and slick User Interface, it’s a snap to add and integrate sensors and carry out remote missions via a dashboard.

While InDro Autonomy was developed to work seamlessly with Commander-enabled robots, our developers wanted to maximise interoperability. So while Commander takes the pain out of robot customisation, you don’t have to have it to deploy InDro Autonomy.

“Not only does InDro Autonomy fully work with any platform Commander supports, InDro Autonomy will work with any ROS2-based Robot that has an IMU, GPS, and wheel odometry,” says Greisman.
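In practical terms, those three inputs typically map to standard ROS2 message types. Here’s a minimal rclpy sketch of a node that listens for them; the topic names are common defaults and will vary from robot to robot:

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu, NavSatFix
from nav_msgs.msg import Odometry

class SensorCheck(Node):
    """Subscribes to the three inputs a GPS-based autonomy stack typically needs."""

    def __init__(self):
        super().__init__('sensor_check')
        # Topic names below are common defaults and may differ on your robot.
        self.create_subscription(Imu, 'imu/data', self.on_imu, 10)
        self.create_subscription(NavSatFix, 'gps/fix', self.on_gps, 10)
        self.create_subscription(Odometry, 'odom', self.on_odom, 10)

    def on_imu(self, msg):
        self.get_logger().debug('IMU data received')

    def on_gps(self, msg):
        self.get_logger().debug(f'GPS fix: {msg.latitude:.6f}, {msg.longitude:.6f}')

    def on_odom(self, msg):
        self.get_logger().debug('Wheel odometry received')

def main():
    rclpy.init()
    rclpy.spin(SensorCheck())

if __name__ == '__main__':
    main()
```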

Here’s a look at the Commander module, the box that’s the brains of Sentinel. It’s directly above the robot treads.

 


How does it work?

 

With lots of coding, lol.

But there’s more than that, of course. InDro Autonomy works by recording its path via GPS during an initial tele-operated cruise through the desired route. In the case of Commander-enabled robots, that GPS trail is immensely accurate.

“We have a very high quality GPS unit in Commander that is accurate to around 5 cm,” explains Greisman. “As the robot moves, it drops GPS breadcrumbs. Later on, at any time, even if you reboot it and drop it somewhere random, it will understand where it is relative to where it’s been – so it can figure out where it needs to go next.”
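Here’s a simplified sketch of the breadcrumb idea (our own illustration, not InDro’s actual implementation): drop a GPS fix every metre or so during the tele-operated pass, then, after any reboot, find the nearest recorded point to work out where the robot sits relative to its route.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    r = 6_371_000  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

breadcrumbs = []  # (lat, lon) pairs recorded during the initial tele-operated drive

def record_fix(lat, lon, min_spacing_m=1.0):
    """Drop a breadcrumb whenever the robot has moved far enough since the last one."""
    if not breadcrumbs or haversine_m(*breadcrumbs[-1], lat, lon) >= min_spacing_m:
        breadcrumbs.append((lat, lon))

def nearest_breadcrumb(lat, lon):
    """After a reboot, locate the robot relative to the recorded route."""
    return min(range(len(breadcrumbs)),
               key=lambda i: haversine_m(*breadcrumbs[i], lat, lon))
```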

 

Wait, there’s more!

 

And there is.

Greisman says InDro Autonomy has been designed for even very large spaces. Often, covering more territory requires greater onboard computational power. Not with this system.

“You can be in as big of a space as you like and there’s no computational limitation to that. What sometimes happens with autonomy systems is that as you go to a bigger space you need more computational power, and this system has been designed to be efficient on low-compute platforms. So this allows us to keep costs low for the client as well.”

 

Anomaly detection, alerts

 

As InDro Autonomy gets ready for a Q1 2023 release, InDro’s R&D team has a few other features it’s planning on adding to the software.

Because it’s all about autonomous missions, it’s assumed that not every inspection will be monitored in real-time. All captured data will be georeferenced and stored, of course, with the ability to review any mission remotely. But InDro thought it would be good to take things even further.

InDro is baking in the ability to customize the software per customer requests. So, for example, if components at an electrical substation should not exceed 50° Celsius, an alert will be sent by text or email if the robot detects temperatures exceeding that parameter. Engineers are also looking into integrating an early warning system for arcing, which could be triggered by its unique sonic signature. Plus, Commander-enabled robots will have the ability to autonomously snug right up to their charging docks.
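A rule like the 50° Celsius example above can be sketched in a few lines; the notifier below is a stand-in for whatever text or email service ends up being wired in, and the component name is purely hypothetical:

```python
TEMP_LIMIT_C = 50.0  # example threshold from the substation scenario above

def check_thermal_reading(component_id, temp_c, notify):
    """Send an alert if a thermal-camera reading exceeds the customer's limit."""
    if temp_c > TEMP_LIMIT_C:
        notify(f"ALERT: {component_id} at {temp_c:.1f} C (limit {TEMP_LIMIT_C:.0f} C)")

# Example usage with a stand-in notifier and a hypothetical component name:
check_thermal_reading("transformer-07", 54.2, notify=print)
```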

So InDro Autonomy has a lot of features. And they’re coming soon.

InDro’s take

 

When InDro engineers developed the InDro Commander, we knew we had something special. And it wasn’t long before a growing number of clients realized that fact. Our Commander module became popular because it simplified a previously painstaking task, making a tough and time-consuming job easier.

We feel the same way about InDro Autonomy – and we believe our clients will, too.

Hallowe’en transformation: From Unitree GO 1 to Pokémon character Jolteon


By Scott Simmie

 

What could possibly be better than dressing up for Hallowe’en?

For Dave Niewinski, the answer was clear: Dress up a robot for Hallowe’en. Specifically, transform a quadruped robot into a Pokémon character.

“This was definitely a different project than what I normally do – usually they’re more technical,” explains Niewinski. “I just wanted to try something different.”

And it doesn’t get more different than this:


A little background

 

Dave Niewinski is an engineer, one who has long had a passion for robotics. He does contract and consulting work for a number of major robotics companies (including InDro) and others interested in learning more about robotic solutions. That’s what pays the bills.

But Niewinski also wears another hat: Builder of fantastical creations that wind up on his Dave’s Armoury website and YouTube channel. You’ll find a ton of amazing videos on that channel – everything from a robotic arm lighting up fireworks for Canada Day through to a custom setup for pouring beer from a tap. (The “arm” in “Dave’s Armoury” comes from his frequent work with robotic arms.)

The YouTube channel provides Niewinski with an outlet for his creative side, a place where he builds devices with wild new capabilities, while also entertaining and educating his audience along the way.

“I end up making YouTube videos to (A): Have fun. But also partly for just educating people,” he says. “Usually when I write code I put the code up online. I also like showing that robots aren’t some scary unattainable thing.”

As it turns out, InDro Robotics had loaned Niewinski a Unitree GO 1 robot to play around with. The quadruped already looks a bit like an animal, and that got him thinking: What if he dressed this up for Hallowe’en?

And so he did, choosing to transform the GO 1 into Jolteon. This wasn’t a simple task, and at one point in the transformation GO 1 looked more than a bit like an unfortunate lamb:

 


With a little help from his neighbour – and some 200 hours of 3D printing – Jolteon took shape.

Unlike most of Niewinski’s projects, this one didn’t involve any coding. It was simple (and not-so-simple) cosmetics.

Check out the full video showcasing how GO 1 became Jolteon: We guarantee you it’s totally worth your time. (And if you like it, subscribe to his amazing channel.)

The reaction

 

Niewinski took Jolteon for a walk around the neighborhood – and people were amazed by his creation.

“For a lot of people my age (he’s 33), our childhood was Pokémon. So to actually see it walking around in reality was unattainable until now. People loved it,” he says.

His kids loved it, too. Maddy (five) and Ollie (three) are used to seeing robots around the house. So much so that they give them nicknames. The GO 1 is affectionately known as ‘Max’ and the AgileX Bunker Pro is known as ‘Frank the Tank.’

Speaking of Frank the Tank: “He pulls all the kids around town. I’ve got a La-Z-Boy on top of him at the moment.”

Below: Jolteon’s 3D-printed head…


Robots keep getting better

 

Just a few short years ago, an affordable quadruped like the Unitree GO 1 would have been unthinkable. But with the increase in computing power (edge computing), better and cheaper sensors, plus advances in robot hardware – robots just keep getting better at a near-exponential pace. Niewinski refers to the sensors, hardware and raw computing power as the three essential forces driving this change.

“All three of those are advancing so quickly, and they all rely on each other,” he says.

“You could have the best dog hardware, but if you don’t have great cameras or great processing power, it doesn’t really matter. Those three pillars of robotics are all advancing, (and) we’re going to continue seeing ridiculous leaps in robotics.”


InDro’s take

 

It’s always a pleasure both working with Dave Niewinski on the serious stuff – and seeing the amazing projects he creates on his own. It’s terrific to see what passion, combined with technical talent, can produce.

We also agree with him about those three pillars of robotics. Whether drones or ground robots, we’ve seen phenomenal leaps in technology in recent years. Better sensors, better robot hardware, better edge computing – all working synergistically.

Those leaps mean more powerful, more affordable solutions for data acquisition, asset monitoring and much more. It also keeps our engineering staff on their toes to see how InDro can further improve that technology for even more ambitious use-cases.

Sometimes, though, it’s nice to take a break and just have fun. And Niewinski’s creation certainly gave us – and, hopefully, you – a smile.

Spexi announces “Spexigon” – a global fly-to-earn platform


Vancouver-based Spexi Geospatial has some news – and it’s big.

The company has announced a plan, and a platform, to capture high-resolution aerial data of the earth with drones. Drone pilots will be able to fly to earn cryptocurrency – or even dollars.

The long-term goal? Well, picture crystal-clear data sets of cities, infrastructure, and even rural settings. With each individual pilot capturing data from different locations, Spexigon will assemble it over time to form a global jigsaw puzzle – and sell parts of that dataset to clients.

We’ll get into more details shortly, but Spexi’s plan has some strong backers – including InDro Robotics.

 

News release

 

News of Spexigon came in the form of an announcement. The company revealed it had secured $5.5 million USD in seed funding “to pursue our vision of collecting Earth’s most important data with drones.” The funding round was led by Blockchange Ventures, with additional investment from InDro Robotics, Protocol Labs, Alliance DAO, FJ Labs, Dapper Labs, Vinny Lingham, Adam Jackson, and CyLon Ventures.

The same team that built Spexi – an easy-to-use system for automated flight and data acquisition – is developing Spexigon. This brief video gives a “big picture” look at how it will work when it’s rolled out next year.

“Fly to earn”

 

A big part of what makes Spexigon’s plan so intriguing is what you might call incentivised crowd-sourcing. Anyone with a drone can download the forthcoming Spexigon app and fly an automated flight. The images will be uploaded to Spexigon to build the database – and the pilot will be rewarded.

“With our new Fly-to-Earn model, people who own consumer drones will be able to earn $SPEXI tokens and dollars while building a high resolution base layer of the earth,” reads the Spexigon announcement. “It is our hope that soon any organization or individual will be able to use the imagery collected by the Spexigon platform to make better decisions.”

 

Business model

 

You could think of this, over time, as something like Google Earth – only with much sharper aerial imagery. Every time a pilot carries out a flight for Spexigon, that map will continue to fill in, building Spexigon’s database. Clients will purchase imagery online.

“This new base layer will enable governments and organizations of all sizes to make better decisions about real world assets like buildings, utilities, infrastructure, risk and natural resources, without requiring people on the ground,” continues the announcement.

“By using Spexigon, organizations that require high-resolution aerial imagery will no longer need to own their own drones or hire their own pilots. Instead, they’ll use our web and mobile app to search for and purchase imagery. Data buyers will then be able to use a variety of internal and external tools to put the imagery to use.”

 

For pilots

 

Spexigon says it will have online training when it launches. Pilots will learn how to use the app to carry out their flights – which, obviously, the pilots will monitor. Depending on the location, pilots can earn cryptocurrency or actual dollars. Some locations will have greater value to Spexigon and its clients than others.

“The app will contain a map of the earth overlaid with hexagonal zones called ‘Spexigons’. Spexigons that are open and ready to fly will be easily visible so pilots can choose an area close to them and begin collecting imagery,” says the company.

“To ensure that imagery is captured in a safe, standardized, and repeatable way, our app controls each pilot’s drone automatically while they supervise the flight. Although our app will do the flying, pilots will always be in command and will have the ability to take back manual control at any time if need arises.”

Spexigon is now starting to build the app, and already has a small community emerging. You can join its Telegram channel here – and there’s also a Discord channel.

As for those ‘Spexigons’, the image below gives you an idea what those pieces of the puzzle might look like.


InDro’s take

 

Since InDro Robotics is one of the backers of Spexigon, we obviously feel the plan is a good one.

It comes from the outstanding team that built Spexi from scratch into a user-friendly, automated system for capturing and crunching aerial data. We also believe drone pilots will embrace this unique “fly to earn” model – a global first.

“The Spexi team has already created an excellent and proven Software as a Service product and clearly has the expertise in this space,” says InDro Robotics CEO Philip Reece. “I’m genuinely excited about the potential for Spexigon to become the ‘go-to’ database of high-quality aerial imagery from around the world.”

So are the rest of us.

InDro Robotics provides live drone video feed at Montreal Marathon in pilot medical project


By Scott Simmie

 

The Montreal Marathon, 2022 edition, was held over the weekend. The main event, the signature 42-kilometre run, took place early Sunday. And three InDro Robotics engineers were there.

They weren’t running, but were instead providing live video feeds from drones. Those feeds were monitored on large screens by dedicated research assistants, who assessed the quality of the feeds and their usefulness in detecting runners who might be in need of medical assistance.

Below: Team InDro, wearing safety vests, with Montreal Marathon runners on the right


Research project

 

InDro became involved with this through Dr. Valérie Homier, an Emergency Physician at McGill University Health Centre. She has long had an interest in how drones can be used in the health care sector, and has collaborated with InDro on two previous research projects.

One of those projects evaluated whether drones or ground delivery could transport simulated blood products more efficiently to a trauma facility – the Montreal General Hospital. Drones were faster.

The second project studied whether drones could help identify swimmers in distress at an IRONMAN event in Mont-Tremblant. You can find that research here.

With the Montreal Marathon coming up, Dr. Homier knew there would likely be medical events. There generally are.

“In these long-distance sporting events there are usually some significant injuries, including cardiac events and heat strokes,” she says.

These tend to be more likely in the later phases of events like marathons, after the athlete has already been under stress for an extended time. The thinking was that perhaps drones could be a useful tool.

Dr. Homier was particularly interested in whether two drones in the air, covering two critical segments toward the end of the marathon, could provide useful data. Specifically, would the live video feed be consistent enough in quality and resolution to be a useful tool?

This pilot aimed to find out.

Below: An uphill segment near the Montreal Marathon finish line. This was the target area for one of the InDro Robotics drones

 


InDro’s role

 

There was a lot of planning required for the mission to ensure the drones could provide continuous coverage and be safe for flying in an area with so many people. Project Manager Irina Saczuk (who happens to also be an RN) worked closely with Dr. Homier to help figure out the nuts and bolts of the InDro side of things.

InDro assigned three employees from the Area X.O facility to the project: Software developers Ella Hayashi and Kaiwen Xu, along with mechatronics specialist Liam Dwyer. All three hold Advanced RPAS certificates and took part in planning meetings to understand the mission and their roles. They also looked into optimising the drones’ video feeds to ensure the best quality would reach those monitoring remotely on large screens.

“At big-scale events such as this marathon, lots of people could go down with injuries,” says InDro’s Ella Hayashi. “But it can be hard to get timely support because roads are blocked. So drones have the potential to really help with sharing the precise location and other information when a person may need help.”

Worth noting here: The InDro engineers/pilots were not to be actively ‘looking’ for people in medical distress. Their role was simply to pilot the drones at the assigned locations and maintain a video feed that provided those watching the large-screen monitors with good situational awareness. In the event of an emergency, the pilots were to follow instructions, including moving in closer to a runner in distress.

 

Sub-250 grams

 

The team took four DJI Mini 2 drones to Montreal. Though InDro has a fleet of much larger and more sophisticated drones of its own design, these consumer drones were perfect for the job. That’s because the Mini 2 is a sub-250 gram drone that can be flown near and over people. In the exceedingly rare event of a failure, the small device is unlikely to cause any substantial injury to someone on the ground. They’re also capable of very good video quality.

The team also used a third-party app – Airdata – to carry the video streams. The app created secure links for each drone’s feed that could be shared with those who would be monitoring the feed. Three drones were to be used in rotation so that two drones were always in the air providing live video at any given time. A fourth drone was onsite for backup.

“We modified the parameters and were streaming in 720p,” explains Dwyer. “We selected a lower resolution because on the bigger screen it didn’t have to be crystal clear but it needed to be smooth.”
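For a rough sense of the tradeoff (the bitrates below are ballpark assumptions for H.264 streams, not measured figures), dropping from 1080p to 720p roughly halves the bandwidth each feed needs – and with two feeds live at all times, that margin matters on a congested network:

```python
# Ballpark H.264 streaming bitrates in Mbps -- assumptions, not measurements.
BITRATE_720P = 3.0
BITRATE_1080P = 6.0
LIVE_FEEDS = 2  # two drones streaming at any given time

print(f"720p total:  {LIVE_FEEDS * BITRATE_720P:.1f} Mbps")
print(f"1080p total: {LIVE_FEEDS * BITRATE_1080P:.1f} Mbps")
```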

There was, initially, some concern over whether the local LTE network would be able to handle the feed due to the large number of people using cellphones to capture and stream from the finish line.

“The night before the mission, a medical person told us there were going to be 20,000 people around the stadium,” says Xu. “We were worried about network connectivity – it was possible that our video streaming would not work. But actually the network was pretty good that day.”

Below is a drone selfie of the InDro team: From left to right, Kaiwen Xu, Ella Hayashi, Liam Dwyer

 


A useful exercise

 

Remember: This was simply a pilot project to determine if drones could provide a clean video stream that might be useful. The pilots were to focus on hovering the drones in two specific adjacent locations, with some overlap in their video to ensure they were not missing any part of this critical section of the marathon.

“Our job was 100 per cent flying the drones,” says Dwyer. “Just straightforward, wide-angle shots with all runners in the field of view.”

We should mention here that InDro also took part in a simulated cardiac event prior to the marathon reaching this area. A medical dummy was placed in a location and one of the drone pilots was instructed to get closer for a better look. A small electric vehicle – think a large golf cart adapted for First Responder use – was dispatched. Chest compressions were performed on the dummy, which was then loaded into the vehicle. A drone followed as the vehicle drove to a nearby stadium and the victim was transported inside to the treatment area. The feed gave others on the Medi-Drone team an opportunity to see, in real-time, the progress of the patient’s arrival.

“The drone response really gave them an active timeline of when they should expect to receive this patient,” says Dwyer.

So the drones proved useful during a simulation. But how would they perform with runners during the actual marathon?

Below: The downhill segment monitored by InDro Robotics


From simulation to real-world

 

As the lead runners came in, the field wasn’t crowded. But, of course, it would become more congested.

When athletes are moving together en masse like this, Dr. Homier says there’s a certain flow that can be observed from the drone. Because that flow is consistent and smooth, a runner in distress immediately stands out as looking out of place.

And it happened. Those watching the live feed spotted someone who appeared to be in distress. They had stopped and were hanging on to a railing at the side of the course. Then they fell over the railing, dropping to the grass. A drone pilot was asked to move in for a closer look. It was clear this runner needed help.

In fact, while the pilots’ instructions were simply to hover their drones, Dr. Homier had anticipated such a scenario, and built it into the protocol for the pilot project. Suddenly, an InDro pilot had become part of a First Responder team, providing much-needed situational awareness.

“It was embedded in the research protocol that eyes on the event becomes what is required,” she explains. “It was called into dispatch and pilots were able to provide eyes on the incident. That was amazing; dispatch came down after and brought us a radio.”

 

Lessons learned

 

For Dr. Homier, there’s still work ahead and a lot of data to be analyzed.

“There’s a lot to learn from this project, and there’s a way forward for multiple surveillance methods,” she says.  “And the drones are way up there. The view from above when monitoring moving crowds is just incomparable.”

Plus, says Dr. Homier, the project sparked a tremendous amount of interest from other healthcare professionals on site.

“The interest was incredible, coming from the drone pilots, the students, the medical directors, the medical staff – they all thought it was so cool,” she says.

“We’re talking about 250 people involved in the medical team. Many came to see the viewing station, so in terms of letting people know about this new use of the technology – that was also a great success.”

Below: Mission accomplished! Team InDro is joined by key members of the marathon’s medical response team for this post-race drone selfie


InDro’s take

 

We’re proud to be involved with this project – just as we’re proud to have collaborated previously with Dr. Valérie Homier on other research projects involving drones. In fact, we find this kind of research particularly meaningful.

“For us, using drones for good is much more than a catchy hashtag,” says InDro Robotics CEO Philip Reece. “Aerial and ground robots can perform so many useful tasks. We’ve helped securely deliver prescriptions to remote locations, COVID test supplies, and more. But playing a role in helping to ensure that someone in medical distress receives timely assistance is up near the top of the list. We look forward to the next project with Dr. Homier.”

And nice job, Ella, Kaiwen and Liam.

PS: We’ve issued a news release about this project. You can read it here.

 

 

InDro’s ROLL-E 2.0 robot delivers to London Drugs customers in Surrey, BC


There’s a new robot in town.

That robot is InDro’s ROLL-E 2.0, and the town (well, city) is Surrey BC.

In the latest phase of an ongoing pilot project with London Drugs, the new version of InDro’s delivery robot was on the job September 9, delivering curbside orders to customers for a touchless, convenient experience. ROLL-E even delivered from the store to a customer’s home.

Let’s take a look at this robot, which features a number of innovations its predecessor, the original ROLL-E, did not have.


A leap from ROLL-E 1.0

 

You might recall that InDro Robotics carried out a longer-term pilot project at a London Drugs location in Victoria, BC. The robot carried out regular curbside deliveries for customers who ordered online and wanted a touchless pickup experience.

The original ROLL-E worked great, but we learned some lessons that have resulted in an even more user-friendly robot. As a result, ROLL-E 2.0 offers a host of new features, including:

  • A total of six cameras, including two sets of depth perception cameras at the front and rear for greater situational awareness for the operator
  • LED running lights, signal lights, brake lights
  • Large cargo bay (50kg capacity) that can be opened and closed remotely
  • Greater all-weather protection and a touchscreen interface for customers

Here’s a quick look at ROLL-E 2.0 on the job:

Tele-operated

 

ROLL-E 2.0 is a tele-operated robot, with an operator controlling it over the 4G or 5G cellular networks. That means the ‘driver’ could be in the local London Drugs outlet, or even hundreds of kilometres away. The person operating sees the view from all cameras, GPS location, ROLL-E’s health – and more – over a computer console. ROLL-E 2.0 is not yet a fully autonomous machine, but it has the capability to eventually think on its own.

The first deployment – for curbside pickups in Victoria – was popular with customers and London Drugs staff. Pushing the envelope with home deliveries was the next logical step.

“Customers were pleased with both the convenience and experience of having goods delivered to their car by robot,” says InDro Robotics CEO Philip Reece. “This took things further, both literally and figuratively. Delivery robots will one day become commonplace, so London Drugs and the City of Surrey are really ahead of the game.”

London Drugs, meanwhile, is interested in continuing to assess the efficiency and customer acceptance of robot deliveries as part of the future of e-commerce.

“Following a successful pilot debut for ROLL-E earlier this year, we are thrilled to be further exploring its capabilities as we test home delivery in conjunction with Indro Robotics and the City of Surrey,” says Nick Curalli, London Drugs vice president of technology solutions. “This is an important step for our company as we look for innovative ways to serve our customers in the safest and most convenient way.”

By the way, that’s InDro’s Kate Klassen on the left in the photo below. She was ROLL-E’s operator for this project. 

 


The Surrey connection

 

Surrey was an ideal test bed. The city welcomes projects like this as part of its Urban Technology Test Lab, which accelerates innovative projects toward commercialisation.

“Responding to the need for technology testing areas, the Urban Technology Test Lab Pilot provides technology firms with access to safe, local test zones,” said Surrey Mayor Doug McCallum. “Without the opportunity to field test in a real-world setting, many of the products could not proceed to final development and commercialisation. I am thrilled to see ‘ROLL-E 2.0’ hit Surrey streets for testing, and I am excited to see this initiative launch. The future truly does live here in the City of Surrey.”


InDro’s take

 

As an R&D company, InDro has always taken a “Crawl, walk, run” approach when it comes to testing new technology. The initial deployment in Victoria with ROLL-E 1.0 was the crawl phase, putting the product through its paces in a real-world setting. ROLL-E 2.0’s development and testing in Surrey was our first chance to walk; we’ll soon be ready to run.

For InDro Robotics, this is about more than a business case. The eventual widespread adoption of robots like these will end countless short trips by automobile to nearby stores for small orders.

“Because ROLL-E 2.0 is electric, these deliveries will eliminate carbon emissions that would have otherwise been created by people driving to the store and back,” says InDro CEO Philip Reece. “This project involves a single robot, but deploying these at scale in the future will have a measurable impact on CO2.”
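As a back-of-the-envelope illustration (the emissions figure is a commonly cited ballpark for a gasoline passenger car, and the trip numbers are purely hypothetical):

```python
CAR_EMISSIONS_G_PER_KM = 200   # rough ballpark for a typical gasoline passenger car
ROUND_TRIP_KM = 4              # hypothetical short drive to the store and back
TRIPS_REPLACED_PER_DAY = 20    # hypothetical deliveries handled by one robot

daily_kg = CAR_EMISSIONS_G_PER_KM * ROUND_TRIP_KM * TRIPS_REPLACED_PER_DAY / 1000
print(f"Roughly {daily_kg:.0f} kg of CO2 avoided per robot, per day")
```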

We’ll check in again when ROLL-E 2.0 starts running.