InDro Robotics builds Street Smart Inspection Robot for safe cycling in winter

By Scott Simmie

 

The safety of pedestrians and cyclists is important in any major city.

Bike lanes and crosswalks are the most obvious signs that we create infrastructure for this purpose. But there are limits to what that infrastructure can do.

Winter, for example, creates additional threats. Ice and snow are the most obvious problems, but potholes and deep puddles can also wreak havoc – particularly for cyclists. Snow can accumulate quickly in a storm, and it’s not uncommon for snow plowed from the roadway to spill over into bike lanes (and even onto sidewalks).

That’s part of the reason why InDro Robotics has built an innovative solution that will be put to the test this winter. It’s called the Street Smart Robot, or SSR – and it’s been designed from the ground up to help ensure safe winter cycling.

Below: A Wikimedia Commons image, taken in Whitehorse by Anthony DeLorenzo. Cyclists in Canada are a hardy bunch.

SSR Winter cyclist

STREET SMART ROBOT

 

The catalyst for the SSR is a research and development partnership fund called the Wintertech Development Program. According to the program website, Wintertech “supports Ontario small and medium enterprises (SMEs) and their partners to validate, test, prototype, and demonstrate new products and technologies designed to meet the unique demands of winter weather conditions.”

Wintertech is run by OVIN, the Ontario Vehicle Innovation Network. That’s a province of Ontario initiative which “capitalizes on the economic potential of advanced automotive technologies and smart mobility solutions such as connected and autonomous vehicles (CAVs), and electric and low-carbon vehicle technologies, while enabling the province’s transportation and infrastructure networks to plan for and adapt to this evolution.”

And while bicycles aren’t connected and autonomous vehicles (at least yet), we felt robotics could play a role in helping to ensure the safety of cyclists in the Smart City of the future. Specifically, we thought an autonomous robot equipped with the right sensors and processing might help to detect safety issues in bike lanes in the winter.

“The idea behind the robot is we want to prolong the use of bike lanes in Ottawa, but also ensure the safety of bike lanes in Ottawa,” explains InDro Robotics Account Executive Luke Corbeth.

“There’s really two parts to this: The first is a machine vision element to see if conditions are good enough for biking – no ice, not too many leaves, etc. On the safety side, the Street Smart Robot is more concerned with detecting things like potholes and cracks. And the idea is if you’re able to identify those things, the right resources can be deployed faster and more efficiently to solve the problem in a timely manner.”

So that’s the idea behind the SSR. Now, let’s take a look at the technology.

Below: A look at the specs of the AgileX Bunker Pro platform. We’ve upgraded the battery for a four-hour run time:

Bunker Pro Robot

BUILT TOUGH

 

If you’re going to be out and about in an Ottawa winter – where temperatures have reached as low as -33°C – you need a platform suitable for the environment. InDro selected the AgileX Bunker Pro as the starting point.

With an IP67 Ingress Protection rating, the Bunker Pro is sealed against dust and can withstand temporary immersion in water – both of which it will encounter in abundance on Ottawa streets. The treads give it an edge over wheels when it comes to navigating ice, snow and potholes. With a full charge, the platform can run for some four hours – even when it’s cold out.

But the platform is just the start. InDro engineers had to put considerable thought into the kinds of sensors that would help the Street Smart Robot carry out its task while safely avoiding cyclists, pedestrians, or other obstacles it might encounter. And that began with four wide-angle pinhole cameras.

“The pinholes are just to give the operator a better understanding of their environment and situational awareness,” explains Corbeth. And while we say “operator” here, it might be more appropriate to say “observer.” Though initial deployments will involve teleoperation, the SSR has been built to carry out its tasks autonomously. We anticipate that down the road (or bike path), a person will simply be involved in monitoring missions in case human intervention is required.

That autonomy will be carried out by a combination of computing power, our InDro Autonomy software stack, and a plethora of sensors.

 

SENSORS, SENSORS, SENSORS

 

The Street Smart Robot is equipped with front- and rear-facing depth cameras. These cameras have two visual sensors each, placed roughly the same distance apart as human eyes. That separation – as in human vision – allows the SSR to perceive the world in depth. Onboard computation (we’ll get to that) calculates how far obstacles might be in the robot’s immediate surroundings, and whether it needs to slow down, stop or change course to avoid obstacles.
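
For readers curious about the underlying geometry: depth falls out of the disparity between the two views. Here is a minimal sketch of that relationship – the focal length and baseline below are illustrative assumptions, not the SSR’s actual camera parameters.

```python
# Minimal stereo-depth sketch: distance is inversely proportional to disparity.
# Z = f * B / d, where f = focal length (pixels), B = baseline (m), d = disparity (pixels).
# The parameter values are illustrative, not the SSR's actual camera specs.

def depth_from_disparity(disparity_px: float, focal_px: float = 640.0,
                         baseline_m: float = 0.065) -> float:
    """Return estimated distance (metres) to a point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive (point visible in both views).")
    return focal_px * baseline_m / disparity_px

# A 20-pixel disparity maps to roughly 2 m with these assumed parameters.
print(round(depth_from_disparity(20.0), 2))  # ~2.08
```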

The depth cameras are supplemented by two 270° 2D LiDAR sensors – which are devoted almost exclusively to obstacle avoidance and safety. There are two additional 3D LiDARs, placed to assist with Simultaneous Localization and Mapping, as well as a GPS and IMU (Inertial Measurement Unit). There’s even a Range Finder that can detect how far an object is off the ground.

“If you have an overhanging tree branch, for example, you could see where it is and see if it obscures the bike lane,” says Corbeth. “We’ve never used a Range Finder before, but could see it would be useful in this application.”

In addition, there’s a thermal/30X optical Pan-Tilt-Zoom camera, complete with a windshield wiper for those sloppy days.

Below: Check out what the SSR has on board.

Inspection Robot

SENSING, AND MAKING SENSE OF, ITS ENVIRONMENT

 

All of those sensors are great. But they don’t do a lot on their own. They require software and computational power to give their data meaning. Here, the SSR has an NVIDIA Jetson AGX Orin for edge computing. The ROS2 (Robot Operating System) software lives onboard InDro Commander – a module we’ve created for rapid integration of sensors and 5G remote operations. Commander can be thought of as the glue that holds everything together.

But still, there’s the issue of making sense of the data. How will the Street Smart Robot identify ice, water, leaves, snow, potholes, debris etc? And how will it determine whether they are significant enough to require a City of Ottawa maintenance crew to take action?

Here, the SSR will be going to school. And by that, we mean a combination of Machine Vision and Machine Learning will teach it what’s worth reporting and what isn’t. One approach is to collect actual images of the things you’d expect to see on a bike path, then label which ones are significant and which can be ignored.

“For example, you could mount a camera on an actual bicycle and then use that dataset to train the robot,” says Corbeth. “But there are a couple of approaches. There’s also obviously the simulation element. You can use specific engines that have high-fidelity graphics to simulate possible different environments in high resolution. And you can use that simulated data for training.”
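
As a rough illustration of the approach Corbeth describes, the sketch below fine-tunes a small off-the-shelf image classifier on a mixed folder of real and simulated bike-lane images. The folder layout, class names and training settings are assumptions for illustration only – this is not InDro’s actual training pipeline.

```python
# Hedged sketch: fine-tune a small classifier on real + simulated bike-lane images.
# Assumed folder layout: data/train/<class>/*.jpg with classes such as
# "clear", "ice", "snow", "pothole" and "debris".
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/train", transform=tfm)  # real + simulated mix
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                     # small illustrative training budget
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
```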

 

THEN WHAT?

 

In the early stages, there will be a human operator at the helm. They’ll likely see some of those obstacles with the pinhole cameras, but from the start we anticipate that the onboard AI will trigger an alert for the operator. Once the operator confirms that the image is worthy of the attention of a maintenance crew, they will contact the city directly.

That’s how things will start. Once we’ve established a high confidence level that the Machine Vision is correctly identifying anomalies, that process can be automated. The SSR will be programmed to automatically send an alert to the city, complete with a captured image of the potential problem and GPS coordinates.

“It’s the idea of anomaly detection – detecting and flagging anomalies. The SSR will also likely be able to identify the severity of the anomaly on a one-to-five scale. And because you have the GPS onboard, the exact location of the problem can be relayed to the city.”
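
To make that reporting flow concrete, here is a hypothetical sketch of the kind of alert payload such a system might assemble for a municipal maintenance endpoint. The field names, severity scale and coordinates are illustrative assumptions, not a published City of Ottawa API.

```python
# Hypothetical anomaly alert: severity on a 1-5 scale plus GPS coordinates and an image.
# Field names and values are illustrative assumptions, not a real city API.
import json
from dataclasses import dataclass, asdict

@dataclass
class AnomalyAlert:
    anomaly_type: str      # e.g. "pothole", "ice", "debris"
    severity: int          # 1 (minor) to 5 (hazardous)
    latitude: float
    longitude: float
    image_path: str        # frame captured by the robot
    timestamp_utc: str

alert = AnomalyAlert("pothole", 4, 45.4215, -75.6972,
                     "/data/captures/pothole_0421.jpg", "2024-01-15T14:32:00Z")
print(json.dumps(asdict(alert)))  # payload that would be sent to the city
```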

Below: The SSR, during its public unveiling at the TCXpo exhibition in June. Front and rear view, to show off all its bits…

 

Street Smart Robot

ON THE SHOULDERS OF SENTINEL

 

Every technical achievement owes something to its predecessors. And that’s no different with the Street Smart Robot.

InDro Robotics previously built Sentinel, a workhorse for remote asset inspection. That robot incorporated an AgileX Bunker platform, an InDro Commander integration module for sensors and remote teleoperations, and several other sensors. It’s a great robot that has proven itself in the most demanding conditions.

The Street Smart Robot builds on the success and learnings of Sentinel with additional compute power, many more sensors, a more robust design and extended operational range. And that opens the SSR up to many other tasks beyond bicycle paths.

“That’s what really excites me about this project – the idea that this technology is transferable across other inspection verticals,” says Corbeth. “This particular robot could be used in agriculture, and the track platform is really good in mining. I could envision this exact same robot in those two verticals, and others.”

 

THE 5G CONNECTION

 

When dealing with multiple sensors in real time, solid data bandwidth is crucial. Here, the SSR is outfitted with a high-speed modem and special antennas optimized for 5G and 4G signals. And while the Street Smart Robot can be operated over 4G, it’s 5G that offers the pipeline necessary for the data to really flow. InDro’s continuing partnership with Rogers will ensure that the SSR has access to Canada’s largest and most reliable 5G network.

Both InDro and Rogers are contributing a significant amount of development money as part of the OVIN Wintertech program. The total value of the project is estimated at $1,395,000. InDro Robotics and Rogers Communications are together contributing two-thirds of the total project value ($780,000 and $150,000, respectively). Support from the Government of Ontario, through OVIN, totals $465,000, or one-third of the project.

Below: Innovations in the InDro Sentinel paved the way for the Street Smart Robot. This image was taken at the opening of the Drone and Advanced Robot Training and Testing facility (DARTT) in June of 2023. That’s Luke Corbeth holding the microphone.

Sentinel water DARTT

INDRO’S TAKE

 

We’re pretty excited about the SSR. As Luke Corbeth says: “It’s a bad-ass robot.”

And it is. But we’re also excited about the project itself.

The Smart City of the future will have connected, embedded sensors (including on robots) helping with everything from traffic flow to updated weather and road conditions. Cyclists are a big part of any major urban centre, and the prospect of helping to ensure their safety appeals to us. It’s also an opportunity to really home in on Machine Vision and Machine Learning as InDro trains the SSR for its task.

“Smart Cities are on their way, and it’s great to be involved in yet another project as we build the capacity for the future,” says InDro Robotics CEO Philip Reece.

“The Street Smart Robot has an important role to play, and could well be the template for anomaly detection and road maintenance at scale in the future. We’re grateful to OVIN and its Wintertech program for selecting this project for development; we’re excited to see what the SSR can do.”

InDro plans to start testing the SSR during the winter, with the project stretching to June of 2024. We’ll be writing more once the robot hits the bike paths.

Assistive devices on the rise at Korea’s Robot World Conference

By Scott Simmie

A major robotics conference is underway in Seoul, South Korea.

Robot World 2023 features some 200 exhibitors and 700 booths, ranging all the way from heavy hitters like Hyundai (which makes robots for industrial purposes) through to companies that manufacture the various widgets that make up the robot supply chain. There are manufacturers of wheels, servos, end effectors, lubricants, cable management systems – you name it, you’ll find it.

Need a hand? There’s no shortage of robotic arms. While many are suited for factory and warehouse work, others are destined for the food services industry. Turn a corner and you’re more likely than not to see an arm smoothly pouring a coffee, grabbing a soft drink or snack and presenting it to an attendee.

Below: A Hyundai robot that can lift and reposition automobiles. It’s part of the Hyundai WIA (World Industrial Ace) division.

USE-CASES

 

The robots at this show illustrate the many use-cases. There are welding robots, pick-and-place machines, and heavy-lift AMRs (Autonomous Mobile Robots) that can lift more than a ton. Need something stacked, sorted, inspected, delivered? Want a manipulator arm you can program to start preparing French fries the moment a take-out order has been placed by an app? Need a robot to move a car?

At Robot World 2023, you’ll find all of the above – and more.

 

ASSISTIVE DEVICES

 

But there was another category of robot on display at the exhibition: Assistive medical devices. Specifically, very smart machines that can be used for patients requiring rehabilitation.

InDro Robotics, which was invited to attend Robot World 2023, was struck by the number of companies with products in this sector. There were ground robots – friendly-looking devices that keep an eye on vulnerable people and can call for assistance if there’s a fall or some other crisis. But more intriguing to us were machines that can play a role – both physiologically and psychologically – in helping to rehabilitate someone from a serious injury or other challenging condition.

Below: A shape-shifting wheelchair wheel can climb stairs

Robot World Microsurgery

RE-LEARNING TO WALK

 

Like any major convention, exhibitors range from established global companies like Hyundai all the way to smaller startups with a great idea. And one that caught our attention is a company called Astrek Innovations. Its CEO and co-Founder is Robin Kanattu, a young engineer from Kerala in southern India.

“We are mainly focussing on building and designing products for the 20 per cent of people who are suffering from disability and accessibility issues,” says Robin. “One of the products is the lower limb exoskeleton, for people who are suffering from lower limb disabilities.”

As the company’s website explains:

“Established in 2018, we develop cutting edge solutions to some of our most complex problems – Disability and Rehabilitation. Leveraging our knowledge and expertise in robotics, machine learning and motion capture, we design devices that would transform the current state-of-the-art in the rehabilitation and assistive technology arena.

“Our magnum opus is a wearable robotic device, an exoskeleton, that would help people with lower-limb immobility walk again. A culmination of motorised limb braces, motion capturing & tracking; and machine learning; this device would transform rehabilitation into a precise, immediate treatment protocol.”

Since then, the company has been building and testing versions of this product for four years.

“Now we have a final version, and we wanted to provide independence for people who are suffering from these disabilities,” he says.

A lot of research has gone into this product. Robin says a great deal of groundwork was spent capturing data on healthy people: How they walk, how they sit, how gaits alter during the course of a stride.

“Now we use that same data to predict the walking pattern of users, so they will have much more stable walking and standing while using the device.”

The exoskeleton provides support and strength and moves the legs. Forward-facing crutches are used to aid in stability. The product can be used by people who are paralysed from the waist down, people recovering from strokes or accidents, and those with certain genetic conditions.

 

HOW THE IDEA WAS BORN

 

Robin is an electrical engineer. But there was a personal motivation to put his skills to use in this arena.

“My grandfather had this issue. After having an accident, he was not able to walk properly. And after doing knee replacement surgery he was not able to walk again,” he explains. “So that’s how our team came together.”

Astrek has been recognized for product excellence at Robot World 2023, and Korea has brought the company in on a program called the K Startup Grand Challenge. Robin has been working in Korea on streamlining the manufacturing chain, working with mentors and looking for collaboration.

But the product, he says, is fully functioning. And people who are paralyzed from the waist down have been able to walk with it.

“Psychologically, they are so happy,” he says. “Their sole dream is to walk again, and we are happy to see them doing that.”

Robin did not have the prototype at the show because of red tape involving flying the batteries to Seoul. He’s pictured below with a banner showing the device.

Robin Kanattu Astrek

ROBOT REHABILITATION

 

Another company, RpiO, has already cracked the market. Its R-BoT plus is a device designed for people with central nervous system damage (including stroke, paraplegia, spinal cord injury etc.). It’s more of a rehab device designed for hospital settings, but allows users to exercise lower extremities while lying down or standing upright. The product is approved by a Korean regulatory body (the Korean Ministry of Food and Drug Safety, formerly the KFDA), and the company has already sold seven units inside Korea.

“We have major hospitals, local hospitals and private hospitals who are using the machine with people who have damage impacting their lower body,” explains CEO Jay Moh.

“Because KFDA is a standard in Southeast Asia, we are starting to sell in Hong Kong, Malaysia and Singapore. Many doctors have come to see our robots.”

The R-BoT plus works in three modes: Passive, active and resistive – depending on the patient’s abilities. What sets this device apart is that the person exercising watches a large-screen display during rehabilitation sessions. The display features outdoor scenes, and with every ‘step’ made, a footprint appears on the ground and the patient has a visual cue that they’re making progress. Distance covered, calories burned and heart rate are all displayed as well, providing further incentive.

“Once the machine starts, they look at the display,” he says. “This has been medically tested; this stimulates the brain and releases a chemical that stimulates recovery. People feel better – they enjoy the workout and feel like they’re walking through the grass.”

For those ready to actually move in the real world, the company also has a product called EXOwalk. Here, an exoskeleton is strapped to the patient’s limbs and can help move their legs (again, in multiple modes). But this exoskeleton is fixed to a rolling robotic platform – meaning the patient actually moves forward on the ground, rather than being fixed to a static machine.

“This is driven – so they actually move along the hallway in the facility.”

 

EXO Motion

 

For patients with upper limb motor impairments, the company has developed a product called EXO motion. This is a portable exoskeleton device that attaches to the arm. In active mode, it detects myoelectric signals from the user’s arm and – with some sophisticated algorithms and mechatronics – converts those signals into mechanical motion that moves the arm.
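
In broad strokes, myoelectric control works by rectifying and smoothing the EMG signal, then mapping activation above a threshold to a motor command. The sketch below is a generic illustration of that principle under placeholder parameters – it is not RpiO’s algorithm.

```python
# Generic myoelectric-control illustration: rectify and smooth the EMG signal,
# then map activation above a threshold to a motor effort. Window size, threshold
# and gain are placeholder values, not RpiO parameters.
def emg_to_motor_effort(emg_samples, window=50, threshold=0.15, gain=2.0):
    """emg_samples: recent raw EMG values (arbitrary units). Returns effort in 0..1."""
    recent = emg_samples[-window:]
    envelope = sum(abs(s) for s in recent) / len(recent)   # rectified moving average
    if envelope < threshold:
        return 0.0                                          # below activation: no assist
    return min(1.0, gain * (envelope - threshold))          # proportional assist

print(emg_to_motor_effort([0.02, -0.03, 0.4, -0.35, 0.5, -0.45] * 10))
```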

In addition to these robotic devices, RpiO is also a leading developer of software designed to help people with dementia.

“We have a high population of elderly people who suffer with this,” says Moh. “So the market is growing very fast.”

Below: CEO Jay Moh, followed by the R-BoT plus and display. Note the footprints…

 

Robot World Korea R-BoT plus
R-BoT plus display

INDRO’S TAKE

 

We enjoyed checking out these devices at Robot World 2023 – and were pleased to see yet more evidence of #robotsforgood.

“Robots can be tremendous tools on their own,” says InDro Robotics CEO Philip Reece. “But there’s something truly special about products designed to directly help human beings improve their mobility and health. We applaud the inventors and engineers who develop these products, and look forward to even more assistive device breakthroughs in future.”

And a final note: The feature image at the top of this story shows some very, very tiny arms used for microsurgery. InDro was able to take a run at the controls (pictured below). It took some patience, but we were able to grasp an impossibly small elastic band.

Now picture a highly skilled microsurgeon operating on someone remotely.

It’s happening now, thanks to robotics.

Robot World Microsurgery
Engineers put skills to the test in F1tenth autonomous challenge

By Scott Simmie

 

Want to win a scale model car race?

Normally you’d pimp your ride, slam the throttle to the max, and do your best at the steering control to overtake any opponents while staying on the track.

Now imagine a race where no one is controlling the car remotely. Where, in fact, the car is driving itself – using sensors and algorithms to detect the course, avoid obstacles, and look continuously for the most efficient path to the finish line.

That’s the concept of F1TENTH, a regular competition held at major robotics conferences. The latest contest was carried out in Detroit at IROS 2023, the International Conference on Intelligent Robots and Systems. The contest brings together researchers, engineers, and autonomous systems enthusiasts.

“It’s about Formula racing, but on a smaller scale – and it’s autonomous,” explains Hongrui (Billy) Zheng, a University of Pennsylvania PhD in electrical engineering, and a key organizer of the F1TENTH series.

And what does it take to win?

“I would say 90 per cent software, and 10 per cent hardware,” says Zheng.

And that means it’s more about brainpower than horsepower.

Before we dive in, check out one of the cars below:

F1tenth

A LEVEL PLAYING FIELD

 

To keep things truly competitive, all teams begin with the same basic platform. They can either build that platform themselves, following the build guides at F1TENTH.org, or purchase it. The price of the vehicle – which this year incorporated a 2D LiDAR unit that makes up the bulk of the cost – is about $2,500-$2,800 US.

“I would say 60 per cent is spent on the LiDAR,” says Zheng. “Some teams use a camera only, and that drives it down to around $1000.”

So it’s a lot more accessible – and a lot safer – than real Formula 1. And instead of high octane fuel, the teams are more concerned with powerful algorithms.

Once again, the open-source Robot Operating System (ROS) autonomy and obstacle avoidance software is part of the package that all teams start out with. But just as real F1 teams work together to extract every ounce of performance, so too do the F1TENTH teams, which usually represent universities but are occasionally sponsored by companies. At this year’s competition, six of the nine teams were from universities.

The F1TENTH organization says there are four pillars to its overall mission. Here they are, taken directly:

1. Build – We designed and maintain the F1TENTH Autonomous Vehicle System, a powerful and versatile open-source platform for autonomous systems research and education.

2. Learn – We create courses that teach the foundations of autonomy but also emphasize the analytical skills to recognize and reason about situations with moral content in the design of autonomous systems.

3. Race – We bring our international community together by holding a number of autonomous race car competitions each year where teams from all around the world gather to compete.

4. Research – Our platform is powerful and versatile enough to be used for a variety of research that includes and is not limited to autonomous racing, reinforcement learning, robotics, communication systems, and much more.

In other words, there are real-world applications to all of this. Plus, for engineers, it’s not that difficult to dive in.

“The entire project is Open Source,” explains competitor Po-Jen Wang, a computer engineer from the University of California Santa Cruz. “It uses a Jetson Xavier (for compute). And for perception it uses a Hokuyo 2D LiDAR. Some people will mount a camera for computer vision. You can make it by yourself – it’s very easy to make.”

The following video provides a good introduction to the competition. In actual races, a piece of cardboard – sometimes modified for aerodynamics – is affixed to the rear of each car. This helps the other vehicles on the track detect and avoid it.

 

PIMP THAT RIDE

 

Okay. So you’ve got your basic build, along with the basic ROS software.

Now it’s time to get to work. Engineers will add or modify algorithms for obstacle avoidance, acceleration, braking – as well as for determining the most efficient and optimal path. Depending on their approach, some teams will plot waypoints for the specific course.
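
As a flavour of what waypoint following can look like in code, here is a stripped-down pure-pursuit steering sketch. The wheelbase and lookahead distance are placeholder values, and a real F1TENTH stack is considerably more involved.

```python
# Minimal pure-pursuit sketch: steer toward a lookahead point on a waypoint list.
# Wheelbase and lookahead are placeholders, not tuned race parameters.
import math

WHEELBASE_M = 0.33      # typical 1/10-scale chassis, assumed
LOOKAHEAD_M = 1.0       # placeholder lookahead distance

def pure_pursuit_steer(pose, waypoints):
    """pose = (x, y, heading_rad); waypoints = list of (x, y). Returns steering angle (rad)."""
    x, y, yaw = pose
    # Pick the first waypoint at least LOOKAHEAD_M away (fall back to the last one).
    target = next(((wx, wy) for wx, wy in waypoints
                   if math.hypot(wx - x, wy - y) >= LOOKAHEAD_M), waypoints[-1])
    # Lateral offset of the target in the vehicle frame.
    dx, dy = target[0] - x, target[1] - y
    local_y = -math.sin(yaw) * dx + math.cos(yaw) * dy
    # Pure-pursuit curvature, converted to a steering angle via the bicycle model.
    curvature = 2.0 * local_y / (LOOKAHEAD_M ** 2)
    return math.atan(WHEELBASE_M * curvature)

print(pure_pursuit_steer((0.0, 0.0, 0.0), [(1.0, 0.2), (2.0, 0.5)]))  # small left turn
```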

Of course, like a real F1 race, a lot of modifications take place once teams are at the track. But in the case of F1tenth, those alterations tend to be code (though we’ll get to mechanical changes in a moment). Of course, scrolling through endless lines of programming isn’t the most efficient way to detect and eliminate bugs or improve efficiency. This is particularly true since multiple types of software are involved.

“There is software for SLAM (Simultaneous Localization and Mapping) for the mapping part, there’s software for localisation, there’s software for basic tracking if you give it a waypoint,” says organizer Billy Zheng. “Some of the basic drivers are found in a repository on GitHub.

“Most of the good teams are very consistent, and most of the consistent ones use mapping and localisation. The second-place winner this year was using a reactive method – you just drop it and it will work.”
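
A “reactive method” of this kind is often a follow-the-gap style controller: scan the LiDAR for the widest clear opening and steer toward its centre, with no map or localisation at all. The sketch below is a bare-bones illustration under assumed scan parameters, not the competitors’ actual code.

```python
# Bare-bones follow-the-gap sketch over a 2D LiDAR scan (angles in radians).
# The clearance threshold and field of view are assumptions, not race settings.
def follow_the_gap(ranges, angle_min, angle_increment, clear_threshold=1.5):
    """Return a steering angle toward the centre of the widest clear gap."""
    best_start, best_len = 0, 0
    start, length = None, 0
    for i, r in enumerate(ranges):
        if r > clear_threshold:              # beam is "clear"
            if start is None:
                start = i
            length += 1
            if length > best_len:
                best_start, best_len = start, length
        else:
            start, length = None, 0
    if best_len == 0:
        return 0.0                           # nothing clear: hold course (real code would brake)
    centre_index = best_start + best_len // 2
    return angle_min + centre_index * angle_increment

# Example: a scan blocked more on the right steers toward the open left side.
scan = [0.4] * 40 + [3.0] * 60 + [0.5] * 20
print(follow_the_gap(scan, angle_min=-1.57, angle_increment=0.0262))  # ~0.26 rad
```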

With all those moving parts, many teams use a dashboard that displays multiple parameters in real-time as the car moves down the track. This allows them to more rapidly nail down areas where performance can be optimised.

“The good teams usually have a better visualisation setup, so it’s easier to debug what’s going on,” adds Zheng. “The good teams are using Foxglove – a spinoff from an autonomous driving company that created a dashboard for ROS.”

To get a better idea of what the engineers are seeing trackside, here’s a look at Foxglove in action during F1TENTH.

MECHANICALS

 

Though it’s 90 per cent about code, that’s not all.

“Some modify their vehicles in different ways, maybe make it more aerodynamic, change the wheels,” explains competitor Tejas Agarwal, a graduate of UPenn with a Master’s in Robotics. Agarwal and Po-Jen Wang were both contracted by Japanese self-driving software company/foundation Autoware.

(As it turned out, Wang and Agarwal placed second and third, respectively.)

The wheels on the stock vehicles are more suited to pavement and dirt than to indoor tracks, so wheels are a common modification. But this year’s winning team, from Université Laval, took it further.

“We lowered the centre of mass as much as possible, changed the wheels, and changed our motor for better control,” says Laval team leader Jean-Michel Fortin, a PhD student in computer science specialising in robotics.

Of course, they weren’t allowed to increase the power of the motor in order to keep things on an even playing field. But they wanted one that offered greater control at lower speeds.

“Usually at low speeds the (stock) motor is bad, so we changed that for a sensor equipped motor,” says Fortin.

“We also replaced our suspension because it was too soft. As soon as we were braking our LiDAR wasn’t seeing what it should. For the software part, we tuned everything to the maximum that we could. We also optimised the race line to make sure the race line that we predict is as close to what the car can do as possible.”

And it paid off. The Laval team, pictured below, was clearly in a celebratory mood after winning (Jean-Michel Fortin in centre). Following is second-place winner Po-Jen Wang, third-place winner Tejas Agarwal and organizer Billy Zheng.

 

Laval F1tenth
Po-Jen F1tenth
Billy F1tenth

INDRO’S TAKE

 

Competitions – particularly ones like this – are highly useful. They foster collaborative teams and encourage innovative thinking. Plus, they’re just plain fun.

“F1TENTH is a tremendous initiative and a really great challenge for young engineers and autonomy enthusiasts,” says InDro Robotics CEO Philip Reece. “Those participating today could well be leaders in the autonomy sector tomorrow. We congratulate all who took part, with a special nod to the top three. Well done!”

Is there a similar engineering challenge you think is worth some words from us? Feel free to contact InDro’s Chief of Content Scott Simmie here.

And, if you’re a competitor beginning a job search, feel free to drop us a line with your resume here. InDro Robotics is Canada’s leading R&D aerial and ground robotics company and in a current phase of scaling. We’re always on the lookout to expand our talented and diverse engineering team.

InDro Robotics takes in IROS 2023 in Detroit

By Scott Simmie

 

One of the most important gatherings in the field of robotics is underway in Detroit.

It’s the International Conference on Intelligent Robots and Systems, or IROS 2023. And InDro Robotics is there.

“IROS is kind of an open forum to discuss research in the fields of mobile robotics, manipulation and so much more,” says Account Executive Luke Corbeth. “It gives researchers the ability to collaborate with each other, as well as industry, through the exhibits.”

Or, as the conference describes itself: “IROS is a large and impactful forum for the international robotics research community to explore the frontier of science and technology in intelligent robots and smart machines, emphasising future directions and the latest approaches, designs and outcomes.”

There’s plenty to see (and learn). You’ll find robotic arms and hands – some with incredible dexterity. There are quadrupeds, bipeds, specialised sensors – even a race course where teams pit small but fast autonomous racers against one another. Plus, of course, scores of seminars and poster exhibits highlighting new and important research in fields ranging from AI to remote microsurgery.

“Everyone who is working on the cutting edge of robotics comes to IROS to present their research,” says Corbeth.

Some of the best minds in the field – including Masters and PhD students from many parts of the world – come to learn, network and share. Even Amazon is here, specifically to hire people to design, build and operate new robots for its warehouses. So too is the Honda Research Institute.

WHAT IS ROS?

 

Though IROS stands for Intelligent Robots and Systems, “ROS” has another relevant meaning. In the industry, it stands for Robot Operating System. As ros.org describes it, ROS “is a set of software libraries and tools that help you build robot applications.”

These libraries and developer tools include state-of-the-art Open Source algorithms that are shared with developers around the planet. The original toolkit is known as ROS1, while the newer ROS2 has more robust security protocols and is being embraced at the corporate and industrial levels.

“Generally what is being built here is being built on ROS,” explains InDro Vice President Peter King. He goes on to explain that you can think of ROS as a facilitator that brings all the different parts of a robot – including different sensors and coding – together.

“ROS is language-agnostic,” says King. “You can bring in Python, you can bring in C++, you can bring in other sensors. ROS allows all of the packages to talk to each other.”
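
To ground what “packages talking to each other” means in practice, here is a minimal Python (rclpy) publisher. A node written in C++ with rclcpp could subscribe to the same topic without either side knowing what language the other is written in – that’s the sense in which ROS is language-agnostic. This is a generic hedged sketch, not any particular InDro package.

```python
# Minimal ROS 2 publisher in Python. A C++ (rclcpp) node can subscribe to the
# "status" topic without knowing this node is written in Python.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class StatusPublisher(Node):
    def __init__(self):
        super().__init__("status_publisher")
        self.pub = self.create_publisher(String, "status", 10)
        self.timer = self.create_timer(1.0, self.tick)   # publish once per second

    def tick(self):
        msg = String()
        msg.data = "robot OK"
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = StatusPublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```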

In some ways, that’s also what InDro Robotics does. As both a research and development company and an integrator, InDro frequently brings together disparate parts for a common purpose – most often, for special projects for clients.

“Everybody here is actually the perfect client for InDro,” says King. 

“Imagine you were studying autonomy and perception and you’re going to do this in ROS. These students and universities don’t have the budget or hardware or time to build what they need. So we can build a custom robot, generally outfitted with InDro Commander, so they can focus on simply coding their project.”

“It’s a very big international community – which I was not expecting,” adds Account Executive Amanda Gloor. “Plus, it’s great to see people showcasing technology from all over the world. One of the cooler things I saw was a robot that climbs storage tanks using magnets – then uses non-destructive testing to detect corrosion.”

Below: InDro Account Executive Amanda Gloor gets the Unitree GO2, which InDro distributes, to take a leap

POSTER EXHIBITS

 

If you’ve got the time (and the brains), the rotating poster exhibits are fascinating to dip into. There are some 1200 exhibitors either displaying their research or holding seminars. Some of that research could be the Next Big Thing, or a significant incremental advance that will be utilised in other applications.

A quick spin through just a few of the exhibits, during a session devoted to healthcare, revealed the following topics:

  • A shared autonomous nursing robot assistant with dynamic workspace for versatile mobile manipulation
  • Magnetic, modular, undulatory robot: Exploring fish-inspired swimming for advanced underwater locomotion and robotics
  • Contactless weight estimation of human body and body parts for safe robotics-assisted casualty extraction

As you can see, some are highly specialised. Now think of hundreds (and hundreds) of such research papers, each making a small (or even large) contribution to pushing the robotics envelope. That’s IROS.

But while such important niche research was in abundance, there was also a sense that the Big Picture moving forward involves AI. While that’s always been a part of the robotics world, recent advances in artificial intelligence, machine learning and machine vision took centre stage. Many of the keynotes – and smaller learning sessions – focussed on AI.

Wednesday’s plenary session, for example, was “Merging Paths: The shared history and convergent future of AI and Robotics.” One of the keynotes was “Deep predictive learning in Robotics: Optimizing models for adaptive perception and action” – followed by: “Empowering robots with continuous space and time representations.” Those are in addition to scores of separate sessions during the conference with an emphasis on AI. 

Instead of robots simply being aware of their surroundings and tasks, we appear to be heading into a world where these machines more fully understand the world around them, and make decisions based on that understanding. And that feels like a very big deal.

 

 

INDRO’S TAKE

 

There are always conferences going on in the robotics and UAV sectors. We could choose to attend all of them, but we tend to be selective.

For the academic and R&D world, IROS is the venue where we can learn about the latest cutting-edge research and technology – and display our own innovations (such as our new ROS-based indoor UAV, which has been gaining a lot of attention). So it’s good to be here again.

“The unique thing about InDro is our ability to have a conversation with virtually everyone at this conference,” says Luke Corbeth. “Given the scope of our work – whether it’s a new platform, or sensors, integration or production, there’s always some way we can be of value to those across the R&D community.”

It’s also a great place to meet the next generation of engineers and other specialists, some of whom may one day join the growing InDro team. 

Drones playing increasing role in disaster response: CAV Canada Panel

By Scott Simmie

 

In recent years, drones have proven indispensable in the field of emergency services.

They’re routinely used to assess damage following disasters, to document serious accidents and allow roads to re-open sooner, for situational awareness during firefighting operations, for Search and Rescue operations – and much more.

So as we head into a future of Smart Mobility and Smart Cities, it’s fair to assume that the role of drones will continue to grow. And that was the thrust of a panel at CAV Canada in Ottawa September 28 entitled “Aerial First Responders: Drones transforming emergency services.”

Moderated by InDro Robotics CEO Philip Reece, the panel brought together experts from the world of drones, EMT, AI/Machine Learning – and more.

 

Philip CAV Canada Drone Panel

THE EVENT

 

CAV Canada is an annual gathering devoted to the field of Connected and Autonomous Vehicles. And drones are very much a part of that sector.

Down the road, it’s anticipated that automated drone deliveries of critical supplies – including medicines and even organs for transplant – will be routine in major urban centres. The US Federal Aviation Administration is already talking about setting aside specific corridors for use by UAVs to help ensure they do not conflict with traditional crewed aircraft. So that connected, autonomous future is coming – and emergency response will be part of that world.

The panel included experts from various specialties within the drone world. Those participating were:

  • Wade MacPherson, an Advanced Care Paramedic with the County of Renfrew and drone operator
  • Sharon Rossmark, CEO of Women and Drones and a commercial aircraft pilot
  • Dr. Robin Murphy, Raytheon Professor of Computer Science and Engineering at Texas A&M University and a specialist in drones and disaster response. (Dr. Murphy was involved with deploying drones following Hurricane Katrina in New Orleans back in 2005; the first use of drones in a US disaster scenario.)
  • Jason Chow, Director of Strategy and Business Development with Elroy Air. The company is manufacturing an automated delivery aircraft that can carry 300 pounds of cargo in a quickly swappable pod
  • Mathieu Lemay, CEO and Co-Founder of Lemay.ai and AuditMap.ai – and an authority on Artificial Intelligence and Machine Learning

Below: The panel. Philip Reece is on the far left; the other panel members appear in order above from L-R

Philip CAV Canada Drone Panel

GROWING USE OF DRONES

 

When it comes to emergency response, there’s no question that drones are now firmly part of the tool kit. And lately, it seems, there’s no shortage of disasters.

“Unfortunately we’re seeing more and more wildfires, more earthquakes, more floods – even tornadoes,” said Reece as he kicked off the session. Paramedic Wade MacPherson said it’s routine to deploy drones in his line of work.

MacPherson said his paramedic organization has eight drones that are used regularly. They’ve been used to deliver prescription medications during floods, in Search and Rescue missions, and for situational awareness. Not only can drones gather data or deliver critical medications, said MacPherson, but they also help keep other professionals out of harm’s way. He sees great potential for their use in delivering Automated External Defibrillators, which are used to help cardiac arrest patients. Research in Renfrew County has shown that a drone can deliver an AED unit faster than a speeding paramedic vehicle.

AEDs by drone, he said “could be an enormous game-changer…time is absolutely critical.” In fact, the odds an untreated cardiac patient will survive diminish by 10 per cent each subsequent minute following the event.

Recently, said MacPherson, the Renfrew paramedics were called to assist in locating a missing Canadian Forces helicopter that had crashed. And again, drones were deployed.

 

ELROY AIR

 

Most drones deployed in emergency response situations are smaller machines – with the smallest weighing just under 250 grams. While such machines can still prove useful for Search and Rescue and situational awareness, a growing number of companies are manufacturing larger uncrewed vehicles capable of greater range and cargo. Elroy Air is one of those companies.

“Our sweet spot is 300 pounds (cargo) and 300 miles (range),” said Jason Chow. The Elroy Air vehicle is a fixed-wing VTOL: it takes off and lands like a helicopter – meaning it doesn’t require a runway – while its fixed wing carries it efficiently in cruise. Its cargo pod can be rapidly switched out. Chow says carrying humanitarian supplies and disaster relief are among the use-cases for such aircraft.

“(The aircraft can carry out) Search and Rescue, monitoring wildfires,” said Chow. “But the main one for us is the cargo pods, being able to go from a supply depot and move the different kinds of supplies the firefighters need to potentially dangerous areas where you don’t want helicopters flying.”

Using drones, he says, takes the risk and cost out of the equation. Medical supplies, food, water – even fuel or batteries – can be carried in those pods.

Below: The Elroy Air Chaparral, with cargo pod

 

Elroy Air Chaparral

AI, Machine Learning, Autonomy

 

Where things get really interesting is when you start layering in enhanced capabilities such as AI, Machine Learning, and autonomous flights.

Systems such as SkyScoutAi are capable of being automatically dispatched the moment AI detects the beginning of a wildfire. Data about the location and intensity of the burn can be quickly relayed to emergency responders. In other words, there’s a human “on the loop” – rather than someone manually operating the aircraft via remote control. It’s faster, more efficient, and should lead to earlier detection and mitigation.
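
In pseudo-terms, the “human on the loop” pattern boils down to this: launch automatically when detection confidence is high enough, and keep a person informed so they can intervene. The function names and the confidence threshold below are illustrative assumptions, not SkyScoutAi’s implementation.

```python
# Hedged sketch of a human-on-the-loop dispatch flow. The callbacks and the 0.8
# threshold are illustrative assumptions, not any vendor's actual system.
def handle_detection(detection, dispatch_drone, notify_operator, threshold=0.8):
    """detection = {'confidence': float, 'lat': float, 'lon': float}."""
    if detection["confidence"] >= threshold:
        mission_id = dispatch_drone(detection["lat"], detection["lon"])  # automatic launch
        notify_operator(f"Drone mission {mission_id} dispatched; monitor and intervene if needed.")
    else:
        notify_operator("Low-confidence detection logged for human review.")
```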

The Elroy Air system also involves automated flights – and the company is exploring automation for loading the cargo pods. In a natural disaster or emergency, this would also mean that critical goods get to the required destination more quickly.

“We want to be able to prepackage all the cargo into these cargo pods so that you don’t have to be there in a dangerous environment,” said Chow. “That’s what we’re thinking about, extending the capabilities and reducing risk.”

The potential for AI appeals to paramedic MacPherson. He explained that while he’s confident about his paramedic skills, he doesn’t have the same proficiency when it comes to drones. An automated flight path for search and rescue operations, he said, would be more efficient than a paramedic manually operating the craft. “I’m an expert in paramedics and ultrasound, but not at all the latest drone techniques,” he said, adding that using AI to optimise the search path would be useful.

There was agreement elsewhere on the panel.

“It’s all about getting the right information to the right person at the right time,” said Dr. Robin Murphy. “How do you get it to them?…So AI’s got a huge role to play.”

 

TRAINING

 

There was also recognition that emergency response requires specialised skills. In the early days, it was enough to simply know how to pilot a drone. Not anymore.

“A lot of people think it’s about learning to fly the drone,” observed Sharon Rossmark of Women and Drones.

“What’s missing is the specific applications and expertise…So really helping people understand that the drone is a tool, but within that there are other applications and other opportunities.”

Below: An InDro Wayfinder drone, which has been used in trials for prescription drug delivery to remote locations

Delivery Drone Canada

INDRO’S TAKE

 

InDro has long been involved with drones (and now robots!) and emergency response. We’ve carried out prescription drug deliveries, Automated External Defibrillator trials, and even shuttled COVID test supplies for an isolated First Nations community at the peak of the crisis. We’ve seen, first-hand, just how valuable these tools can be.

“There’s no question that drones and robots have become essential tools for First Responders,” says InDro Robotics CEO Philip Reece. “It’s also pretty clear that their utility will continue to grow. AI and automation will add both to their value – and to the number of applicable use-cases. We look forward to helping to push the envelope.”

A final FYI: InDro has carried out specialised drone training for First Responders for many years. We are now able to expand that training to include ground robots at the Drone and Advanced Robotics Training and Testing facility at Area X.O in Ottawa (which also features a huge, netted enclosure for drone training and evaluation). If you’re interested, please contact us here.

InDro Robotics, Tallysman partner on precision GNSS solution for ground robots

By Scott Simmie

 

There’s nothing like synergy.

And a new high-precision solution for location and heading – a collaboration between InDro Robotics and Tallysman Wireless – is the result.

As you likely know, while GPS is great – and good enough for us driving around using Waze – its accuracy can leave something to be desired. Traditional Global Navigation Satellite System (GNSS) solutions are generally accurate to around 2.5 metres Circular Error Probability (CEP). That means the reported location has a 50 per cent chance of being within 2.5 metres of where it actually is. What’s more, that error rate exists under ideal conditions – with the detection system stationary and with an unobstructed view of satellites.

That’s good enough for a car with a driver, or a cargo ship. But many robotic applications require far greater precision. If you’re running a remote inspection robot using GPS waypoints, 2.5 metres isn’t good enough.

Think about it. In the example of a remote inspection robot, the device is generally operating at the location of a high-value asset. You might want a repeatable routine where the robot can get up close to look at gauges, valves, or anything else requiring inspection. You might want it to pass through a doorway, or get very close to – without touching – a highly energised electrical component. So accuracy matters, whether in this application or many other use-cases.
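
To make that accuracy requirement concrete, CEP can be estimated empirically: log repeated fixes from a stationary receiver and take the median radial error from the known true position. The sample offsets below are made-up numbers, purely for illustration.

```python
# Estimating Circular Error Probability (CEP): the radius containing 50% of fixes.
# The sample offsets (metres east/north of the true position) are made up.
import math
import statistics

fixes = [(1.2, -0.8), (2.7, 1.1), (-0.5, 2.0), (3.1, -1.9),
         (0.4, 0.9), (-2.2, 1.5), (1.8, 2.4), (-1.1, -2.6)]

radial_errors = [math.hypot(east, north) for east, north in fixes]
cep = statistics.median(radial_errors)   # half the fixes fall inside this radius

print(f"Estimated CEP: {cep:.2f} m")
print("Precise enough for waypoint inspection?", cep <= 0.025)  # 2.5 cm target
```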

Now, InDro and Tallysman Wireless, a Calian company, are pleased to announce a solution.

Below: An InDro Sentinel inspection robot

Inspection Robot

LOCATION, LOCATION, LOCATION

 

That real estate phrase certainly applies when it comes to high-level robotics. But how do you get from a potential error of 2.5 metres down to, say, 2.5 centimetres – two orders of magnitude?

In this case, the solution came by partnering with Tallysman Wireless.

The company is known for its leading-edge GNSS and Iridium antennas. Tallysman also has an enviable reputation for customising those antennas for global clients seeking specific GNSS solutions. Ships, aircraft, trains and even drones carry out critical missions daily while relying on Tallysman solutions. InDro, meanwhile, is known for its R&D work, robots and custom innovations in the aerial and ground robotics world. So the potential was there for collaboration.

Because – and we know this well – it’s not just as simple as plugging antennas and receivers together and hoping for the best.

Tallysman InDro Backpack

THE ISSUE(S)

 

Antennas – even really good ones – are finicky pieces of equipment. Depending on what you’re mounting them on, they can be subject to interference that diminishes their performance. The type of connector attaching them to a receiver, even the length of the cable used in that connection, can also detract from optimal signal acquisition and accuracy. That’s why you want someone like Tallysman Wireless on board.

“We are experts in Global Navigation Satellite System (GNSS) solutions, specifically the antenna side of things,” explains Gord Echlin, Tallysman’s Director of Business Development.

“A lot of people assume, unless you use the term GPS, that it’s a fairly easy thing to implement and get accurate results. Nothing could be farther from the truth.”

As Tallysman Wireless explains in a brochure:

“The problem is that keeping the antenna to system interface in the analog domain requires a lot of RF expertise to manage, expertise that is not widely available, and these issues amplify with long signal transmission over cables. Digital systems with built-in LTE communication links, common components of autonomous systems, are direct threats to the integrity of the GNSS signal.

“Tallysman Wireless has solved this problem by integrating their sensitive, high-performance GNSS antennas in the same package as advanced GNSS receivers, in what is commonly called a ‘Smart’ Antenna. The GNSS processing solution is housed in a compact package, carefully engineered to mitigate the potential impairments between the antenna and receiver, and the Position/Navigation/Timing (PNT) information is communicated to the application system in the digital domain, over a serial interface (UART, USB, CANbus, or Automotive Ethernet).”

Or, as Echlin puts it: “We take the receiver, and we take the antenna and put it in the same package to mitigate the outside interference such that you get less than two centimetres of error. That, along with very precise heading information – with accuracy to 0.3 degree.”
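
On the robot side, consuming that digital output can be as simple as reading position sentences from the serial port. The sketch below assumes a USB serial connection and standard NMEA GGA sentences – the port name, baud rate and message format are assumptions about a typical setup, not Tallysman documentation. It relies on the pyserial and pynmea2 packages.

```python
# Hedged sketch: reading position from a GNSS smart antenna over a serial interface.
# Port, baud rate and the use of NMEA GGA sentences are assumptions about the setup.
import serial      # pyserial
import pynmea2

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line.startswith("$") and "GGA" in line:
            msg = pynmea2.parse(line)
            # gps_qual 4 = RTK fixed, 5 = RTK float (standard GGA convention)
            print(msg.latitude, msg.longitude, "fix quality:", msg.gps_qual)
```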

THE INDRO CONNECTION

 

With the Tallysman Wireless integrated solution, there was just one more piece of the puzzle remaining: How to integrate it into a robot? That’s where InDro’s engineering team came in.

The package required a software interface where none had existed before. InDro engineers created a ROS2 coding solution and integrated the Tallysman smart antenna onto our InDro Backpack – which we use with Unitree quadrupeds.

InDro Backpack is our system for remote teleoperations over 5G (and 4G), which also allows for rapid sensor integration and other customization using ROS1 and ROS2 software libraries (which, along with an edge computer and high-speed modem, are on-board).

The complete solution – hardware and software – enables consistent, high-accuracy positioning. InDro plans to offer it to clients in need of precision positioning.
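
We can’t publish the actual driver here, but a ROS2 integration of this kind typically ends with the fix being published as a sensor_msgs/NavSatFix message that the rest of the stack can consume. The sketch below is a minimal illustration of that idea, with a placeholder topic name and hard-coded coordinates – it is not InDro’s code.

```python
# Minimal illustration: publishing GNSS fixes into ROS 2 as sensor_msgs/NavSatFix.
# Topic name, rate and the hard-coded position are placeholders, not InDro's driver.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import NavSatFix

class GnssPublisher(Node):
    def __init__(self):
        super().__init__("gnss_publisher")
        self.pub = self.create_publisher(NavSatFix, "gnss/fix", 10)
        self.timer = self.create_timer(0.1, self.tick)   # 10 Hz, assumed rate

    def tick(self):
        fix = NavSatFix()
        fix.header.stamp = self.get_clock().now().to_msg()
        fix.header.frame_id = "gnss_antenna"
        fix.latitude, fix.longitude, fix.altitude = 45.4215, -75.6972, 70.0  # placeholder
        self.pub.publish(fix)

def main():
    rclpy.init()
    rclpy.spin(GnssPublisher())
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```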

Below: The Tallysman Smart Antenna solution, integrated with new ROS2 coding into the InDro Backpack

InDro Backpack Tallysman

INDRO’S TAKE

 

We were pleased to partner with Tallysman Wireless on this integration project. With robots increasingly used for remote inspection of high-value assets, accurate positioning and heading data has become essential.

“Being able to tap into the expertise of Tallysman Wireless – and combine their solution with software from InDro engineers – has resulted in a powerful solution,” says InDro Robotics CEO Philip Reece.

“We’ve already begun integrating this onto our own robots, and look forward to offering this to clients in need of the most accurate and reliable positioning possible.”

Looking for more information? Connect with us HERE. And if you happen to be attending TCXpo, the solution is on display September 27 at Area X.O in Ottawa.