InDro Robotics ROS-based drone an R&D powerhouse

By Scott Simmie

 

InDro Robotics is pleased to unveil details of its highly capable new R&D drone.

Running the Robot Operating System (ROS) and with powerful onboard compute capabilities, the drone is perfect for advanced Research and Development.

“It’s a drone geared toward R&D first and foremost,” explains Luke Corbeth, Head of R&D Sales. “It truly is a flying robot – and you can program and use it in a very similar fashion to all our other robots.”

There’s a real demand in the research world for open-source drones that can be programmed to run highly complex algorithms. These kinds of drones can be used to study swarm behaviour, object detection and identification, mapping in GPS-denied locations and much more.
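
For a flavour of what that programmability means in practice, here’s a minimal sketch of a ROS 2 node in Python. The topic names, target altitude and gain are our own illustrative assumptions – not the drone’s published interface – but the pattern (subscribe to state, publish commands) is the same one used across ROS robots.

# Minimal sketch of a ROS 2 node commanding a ROS-based drone.
# Topic names (/odom, /cmd_vel) and all constants are illustrative
# assumptions, not a published InDro interface.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry


class HoverAndHold(Node):
    """Reads odometry and publishes simple climb/descend setpoints."""

    def __init__(self):
        super().__init__('hover_and_hold')
        self.create_subscription(Odometry, '/odom', self.on_odom, 10)
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.target_alt = 2.0  # hypothetical target altitude, metres

    def on_odom(self, msg: Odometry):
        # Proportional climb/descend toward the target altitude,
        # clamped to +/- 1 m/s.
        error = self.target_alt - msg.pose.pose.position.z
        cmd = Twist()
        cmd.linear.z = max(min(0.5 * error, 1.0), -1.0)
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(HoverAndHold())


if __name__ == '__main__':
    main()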

For some researchers, the budget go-to has been the Crazyflie, a micro-drone that uses a Raspberry Pi for compute. Its advantage is that it’s quite affordable. But its low cost, 27-gram weight and relatively low computing power mean it has limitations – including an inability to carry sensors of any real weight.

“This drone can do so much more,” says Corbeth. “With the NVIDIA Xavier NX onboard for compute, it can effectively map entire environments. And when it comes to landing and object recognition, it’s truly phenomenal. It can even land on a moving vehicle.”

Below: A look at InDro’s new drone, which comes complete with LiDAR, a depth-perception camera, 5G connectivity – and much more.

THE BACK STORY

 

If you’ve been following the latest news from InDro, you’ll be aware we have an incubation agreement with Cypher Robotics. That company builds solutions for cycle counting and precision scanning in the industrial/supply chain space. InDro assisted with the development of its signature product, Captis.

Captis integrates an autonomous ground robot with a tethered drone. As the Captis robot autonomously navigates even narrow stock aisles, the drone ascends from a tether attached to that ground robot. The drone then scans the barcodes (it’s code-agnostic) of the products on the shelves. All of that data is transferred seamlessly, in real time, to the client’s Warehouse Management System (WMS), Warehouse Control System (WCS) and Warehouse Execution System (WES) software.

The capabilities of Captis led to a partnership with global AI fulfilment experts GreyOrange and leading global telco innovator Ericsson. The product debuted at the recent MODEX 2024 conference (one of the biggies in the automated supply chain world), where it gained a *lot* of attention.

While working on the project, it was always clear the drone – thanks to multiple modifications – would be highly suitable as a research and development tool. It’s capable of machine vision/object recognition, machine learning, and can find its way around in completely unfamiliar, GPS-denied environments.

“In fact, I have one client that’s using it for research in mines,” says Corbeth.

 

THE JETSON DIFFERENCE

 

NVIDIA has made quite a name for itself – and quite a profit for its shareholders – with its powerful AI-capable processors. The Jetson Xavier NX features a 6-core NVIDIA Carmel Arm®v8.2 64-bit processor running at speeds of up to 1.9 GHz. Its graphics processing unit features a 384-core NVIDIA Volta™ architecture with 48 Tensor Cores. Put it all together, and the computing power is astonishing: The Xavier NX is rated with a maximum achievable output of 21 TOPS – trillion operations per second. (We were going to try to count, but thought it more efficient to rely on NVIDIA’s specs for this.)

The LiDAR unit currently shipping with the drone also has some flex. It’s the Ouster 32-channel OS1 (Rev6.2). With a maximum range of 200 metres (90 metres on a dark, 10 per cent reflective target), its powerful L3 chip can process up to 5.2 million points per second at the family’s maximum of 128 channels of vertical resolution (again, we didn’t count). Hostile environment? No problem. The LiDAR can operate from -40°C to 60°C and carries an IP68 Ingress Protection rating.

“The OS1 is designed for all-weather environments and use in industrial automation, autonomous vehicles, mapping, smart infrastructure, and robotics,” states its manufacturer. “The OS1 offers clean, dense data across its entire field of view for accurate perception and crisp detail in industrial, automotive, robotics, and mapping applications.”

The unit uses open-source ROS and C++ drivers, and comes with Ouster’s Software Development Kit. Its ability to accurately sense its environment (at distances as close as 0.5 metres), combined with the NVIDIA processor and the depth camera, also allows this machine to do something pretty extraordinary: It can recognise and land on a moving platform.

“That’s a very challenging problem to solve and requires not only specific sensing but also really powerful onboard compute. This drone can do it,” explains Corbeth.
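
For a sense of what building on those open-source drivers looks like, here’s a hedged sketch: a ROS 2 node that watches the Ouster point cloud and reports the nearest return – the kind of primitive an obstacle-avoidance or landing pipeline builds on. The /ouster/points topic name is a common driver default, and the 0.5-metre floor mirrors the minimum sensing distance noted above; treat both as assumptions rather than InDro’s actual code.

# Hedged sketch: report the nearest return from an Ouster point cloud.
# The /ouster/points topic is a common driver default; treat it as an
# assumption for your own setup.
import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2


class NearestReturn(Node):
    def __init__(self):
        super().__init__('nearest_return')
        self.create_subscription(PointCloud2, '/ouster/points',
                                 self.on_cloud, 10)

    def on_cloud(self, msg: PointCloud2):
        nearest = float('inf')
        for p in point_cloud2.read_points(
                msg, field_names=('x', 'y', 'z'), skip_nans=True):
            r = math.sqrt(float(p[0])**2 + float(p[1])**2 + float(p[2])**2)
            if 0.5 <= r < nearest:  # ignore returns inside the 0.5 m floor
                nearest = r
        self.get_logger().info(f'Nearest return: {nearest:.2f} m')


def main():
    rclpy.init()
    rclpy.spin(NearestReturn())


if __name__ == '__main__':
    main()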

Word about the product has been spreading: a number of units have already been sold to academic institutions for research purposes – and the team has been hard at work building and testing for the next set of orders (as seen below).

THE FORGE CONNECTION

 

Like all new products, the new drone required custom parts. We looked no further than InDro Forge, our rapid prototyping and limited production run facility in Ottawa.

Using state-of-the-art additive and subtractive tools, the Forge team created custom mounts from carbon fibre and other strong but lightweight materials, while also ensuring the frame was robust enough to take on even the most challenging environments where these drones will be deployed.

“InDro Forge has been critical to the finished product,” says Corbeth. “We wanted a look, feel and quality that matches this drone’s capabilities – and InDro Forge delivered.”

INDRO’S TAKE

 

We’re obviously excited about the capabilities of this new drone, and we’re not alone. Interest in this product from researchers has already been significant. In fact, we’re not aware of any other drone on the market offering this combination of specific capabilities.

It was that void – in concert with our partnership with Cypher Robotics – that led to its creation.

“InDro has always placed a great emphasis on the development of innovative new products,” says CEO Philip Reece. “We build new products at the request of clients and also develop our own when we see a market opportunity. In this case, the requirements for Cypher Robotics dovetailed nicely with demand for such a drone from researchers.”

Production of the new drone is moving at a swift pace. If you’re interested in a briefing or demo, you can contact us here.

QUEBEC’S HAPLY ROBOTICS MAKES THE VIRTUAL FEEL REAL

By Scott Simmie

 

Odds are you’ve heard of remote surgery by now.

That’s where a surgeon, looking at screens that provide incredibly detailed 3D video in realtime, conducts the operation using a controller for each hand. The inputs on those controllers are translated into scaled-down movement of robotic arms fitted with the appropriate medical devices. The robotic arms are capable of moving a precise fraction of the distance of the operators’ hands. As a result, these systems allow for far greater control, particularly during really fine or delicate procedures. 
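
To make that concrete, here’s a toy sketch of motion scaling: operator hand deltas are filtered for jitter and multiplied down before being sent to the instrument. The scale factor and deadband are invented for illustration – real surgical systems are vastly more sophisticated.

# Toy illustration of tele-surgical motion scaling. The constants are
# invented; real systems add filtering, safety checks and much more.

SCALE = 0.2               # robot moves 1 mm for every 5 mm of hand travel
TREMOR_DEADBAND = 0.0005  # metres; ignore sub-half-millimetre jitter


def scale_motion(hand_delta):
    """Map an operator hand displacement (x, y, z) to an arm displacement."""
    return tuple(
        0.0 if abs(d) < TREMOR_DEADBAND else d * SCALE
        for d in hand_delta
    )


# A 10 mm hand movement becomes a 2 mm instrument movement:
print(scale_motion((0.010, 0.0, 0.0)))  # (0.002, 0.0, 0.0)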

The surgeon might be at a console in the operating theatre where the patient is. Or they could be operating on someone remotely. You could have a specialist in Montreal perform an operation on someone elsewhere in the world – providing you’ve got a speedy data connection.

The video below does a really good job of explaining how one of the best-known systems works. 

 

THE IMPORTANCE OF FEEL

 

Conducting standard surgery (or a variety of other tasks) without robots involves constant tactile feedback.  If a doctor is moving an instrument through tissue – or even probing inside an ear – they can feel what’s going on. Think of cutting a piece of fruit; you adjust the pressure on the knife depending on how easy the fruit is to slice. When you put a spoon into a bowl of jello, that constant feedback from the utensil helps inform how hard or soft you need to push.

This tactile feedback is very much a part of our everyday lives – whether it’s brushing your teeth or realising there’s a knot in your hair while combing it. Even when you scratch an itch, you’re making use of this feedback to determine the appropriate pressure and movements (though you have the additional data reaching your brain from the spot being scratched).

But how do you train someone to perform delicate operations like surgery – even bomb defusal – via robotics? How do you give them an accurate, tactile feel for what’s happening at the business end? How much pressure is required to snip a wire, or to stitch up a surgical opening?

That’s where a company from Quebec called Haply Robotics comes in.

“Haply Robotics builds force-feedback haptic controllers that are used to add the sense of touch to VR experiences, and to robotic control,” explains Product Manager Jessica Henry. “That means that our controller sits on the human interface side and lets the human actually use their hand to do a task that is conveyed to a robot that’s performing that task.”

We met some of the Haply Robotics team during the fall at the IROS 2023 conference in Detroit. We had an opportunity for a hands-on experience, and were impressed.

 

INVERSE3

 

That’s the name of Haply’s core product.

“The Inverse3 is the only haptic interface on the market that has been specially designed to be compact, lightweight, and completely portable,” says the company’s website. “Wireless tool tracking enables you to move freely through virtual environments, while our quick tool change mechanism allows you to easily connect and swap VR controllers, replica instruments, and other tools to leverage the Inverse3’s unmatched power and precision for next-generation force-feedback control.

“The Inverse3 replicates tactile sensory input required for simulating technical tasks. It can precisely emulate complex sensations like cutting into tissue or drilling into bone – empowering students, surgeons, and other healthcare professionals to hone and perfect medical interventions before ever performing them in the clinical environment.”

Haply Robotics has produced an excellent video that gives you both a look at the product – and how it works:

 

WHAT DOES IT FEEL LIKE?

 

While at IROS, we had a chance to put our hands on the Inverse3.

In one of the simulations (which you’ll see shortly), the objective was to push a small sphere through a virtual gelatin-like substance. As you start pushing the ball against that barrier, you begin to feel resistance through the handle of the Inverse3. Using force-feedback, you continue to push and feel that resistance increase. Finally, when you’ve hit precisely the correct amount of pressure, the ball passes through the gelatin. The sensation, which included a satisfying, almost liquid ‘pop’ as the ball passed through, was amazing. It felt exactly like you would have anticipated it would feel with a real-world object.
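
Effects like this are often rendered with a simple penalty model: the deeper the tool penetrates the virtual surface, the harder the motors push back – until a breakthrough threshold is crossed and the force vanishes, producing that pop. Below is a toy-numbers sketch of the idea; it is emphatically not Haply’s implementation.

# Toy penalty-based haptic rendering. Constants are invented for
# illustration; this is not Haply's implementation.

STIFFNESS = 300.0    # N/m, virtual "gelatin" stiffness
BREAK_DEPTH = 0.02   # metres of penetration at which the ball pops through


def render_force(depth):
    """Return the resistance force (N) for a given penetration depth (m)."""
    if depth <= 0.0:
        return 0.0               # tool not touching the surface
    if depth >= BREAK_DEPTH:
        return 0.0               # membrane broken - force vanishes ("pop")
    return STIFFNESS * depth     # spring-like resistance while pressing


for depth in (0.0, 0.005, 0.015, 0.025):
    print(f'{depth * 1000:.0f} mm -> {render_force(depth):.1f} N')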

“Touch adds more information, as opposed to just having the visual information,” explains Henry. “You also have the tactile information, so you have a rich amount of information for your brain to make a decision. You can even introduce different haptic boundaries so you can use things like AI in order to add some kind of safety measure. If the AI can say ‘don’t go there’ – it can force your hand out of the boundary with haptic cues. So it’s not just visual, it’s not just audio.”

 

SIMULATION, TRAINING…AND MORE

 

The Inverse3 is already in use for simulation training in the medical industry. In fact, many existing devices for robotic surgery do not have haptics – and there’s clearly a demand.

“Robotic surgical consoles don’t use haptics yet, and we’re hearing that surgeons are asking for that to be added because it’s missing that sense,” says Henry. “A mistake they can make is to push an instrument too far in because it’s just visual. If you had haptics on your handles, you would intuitively know to pull back.”

Remember how we tried pushing a virtual object through a gel-like substance? You’ll see that in this video around the :24 mark:

THE HAPLY STORY

 

Well, it’s not the entire Haply Robotics story, but here it is in a nutshell.

The idea for the product – for the need for such a product – first surfaced in 2016. The three co-founders were working on haptic devices at Canada’s National Research Council. Devices available at the time were large and tended not to offer the greatest user experience. They saw an opportunity to create something better. The company has been in business since 2018 – with these three at the helm:

  • Colin Gallacher (MEng, MSc, President)
  • Steve Ding (MEng, Electrical lead)
  • Felix Desourdy (BEng, Mechanical lead)

The trio put their heads together and – a lot of R&D later – produced the Inverse3.

The company manufactures the physical product, which contains three motors to provide haptic feedback. Haply Robotics also makes an API, but the coding for the simulations comes from outside partners. Fundamental VR, for example, is a company devoted to developing virtual training simulations for everything from ophthalmology to endovascular procedures. It coded that gelatin simulation.

“Studies confirm that VR significantly improves the effectiveness of medical education programs. Adding real haptics increases accuracy and delivers full skills transfer,” says the Fundamental VR website. In fact, it cites research showing a 44 per cent improvement in surgical accuracy when haptics are part of the VR experience.

“In the training space, when you’re using it for simulation, a surgeon’s work is very tactile and dexterous,” says Haply’s Jessica Henry. “We enable them to train using those instruments with the proper weights, the proper forces, that they’d encounter in surgery as opposed to textbooks or cadavers. It’s a more enriched way of interacting.”

And it really, really feels real.

Below: Haply’s Jessica Henry manipulates the Inverse3

 

 

INDRO’S TAKE

 

It’s always great discovering another new company in the robotics field, particularly one with an innovative solution like the Inverse3. It’s also great when these companies are Canadian.

“Haply Robotics has identified a clear void in the marketplace and created a solution,” says InDro Robotics CEO Philip Reece. “With the growth in remote robotics – not just surgery – I can see a wide range of use-cases for the Inverse3. Congratulations to the Haply team on being ahead of the curve.”

For more info on the product, check out the Haply Robotics website.

InDro Commander module streamlines robotics R&D

By Scott Simmie

 

Building robots is hard.

Even if you start with a manufactured platform for locomotion (very common in the case of ground robots), the work ahead can be challenging and time-consuming. How many sensors will require a power supply and data routing? What edge processing is needed? How will a remote operator interface with the machine? What coding will allow everything to work in unison and ensure the best data and performance possible? How will data be transmitted or stored?

That’s the hard stuff, which inevitably requires a fair bit of time and effort.

It’s that hurdle – one faced by pretty much everyone in the robotics R&D world – that led to the creation of InDro Commander.

InDro Commander

WHAT INDRO COMMANDER DOES

 

InDro Commander is a platform-agnostic module that can bolt on to pretty much any means of locomotion. In the photo above, it’s the box mounted on top of the AgileX Bunker (just above the InDro logo).

Commander is, as this webpage explains, “a single box with critical software and hardware designed to simplify payload integration and enable turn-key teleoperations.” Whether you’re adding LiDAR, thermal sensors, RTK, Pan-Tilt-Zoom cameras – or pretty much any other kind of sensor – Commander takes the pain out of integration.

The module offers multiple USB inputs for sensors, allowing developers to decide on a mounting location and then simply plug them in. A powerful Jetson edge computer handles onboard compute functions. The complete Robot Operating System software libraries (ROS1 and ROS2) are bundled in, allowing developers to quickly access the code needed for various sensors and functions.
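
In practice, that bundling means sensor bring-up can amount to a single ROS 2 launch file that starts a driver node per plugged-in device. Here’s a hedged sketch of the idea – the package and node names below are placeholders (the teleop bridge is entirely hypothetical), not Commander’s actual configuration:

# Hedged sketch of a ROS 2 launch file for a Commander-style sensor stack.
# Package/executable names are placeholders, not InDro's configuration.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # LiDAR driver for a sensor plugged into the module
        Node(package='ouster_ros', executable='os_driver',
             name='lidar_driver'),
        # USB camera driver
        Node(package='usb_cam', executable='usb_cam_node_exe',
             name='ptz_camera'),
        # Hypothetical bridge for remote teleoperation
        Node(package='robot_teleop', executable='teleop_bridge',
             name='teleop_bridge'),
    ])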

“Our engineering team came up with the concept of the InDro Commander after integrating and customizing our own robots,” says Philip Reece, CEO of InDro Robotics. “We realized there were hurdles common to all of them – so we designed and produced a solution. Commander vastly simplifies turning a platform into a fully functioning robot.”

Account Executive Luke Corbeth takes it further:

“The Commander serves as a ‘brain-box’ for any UGV,” he says. “It safely houses the compute, connectivity, cameras, sensors and other hardware in an IP54 enclosure.”

It also comes in several options, depending on the client’s requirements.

“There are three ‘standard versions’, which are bundled to be either Compute Ready, Teleoperations Ready or Autonomy Ready,” adds Corbeth.

“I’ve realized over time that the value of Commander is our ability to customize it to include, or more importantly, not include specific components depending on the needs of the project and what the client already has available. In reality, most Commanders I sell include some, but not usually all, of what’s in the Commander Navigate. We’re also able to customize to specific needs or payloads.”

Below: Commander comes in multiple configurations

COMMANDER DOES THE WORK

 

With InDro Commander, developers can spend more time on their actual project or research – and far less time on the build.

“For end-users wanting a fully customized robot, Commander saves a huge amount of time and hassle,” says InDro Engineering Lead Arron Griffiths. “Customers using this module see immediate benefits for sensor integration, and the web-based console for remote operations provides streaming, real-time data. Commander also supports wireless charging, which is a huge bonus for remote operations.”

Commander serves as the brains for several InDro ground robots, including Sentinel. This machine was recently put through its paces over 5G in a test for EPRI, the Electric Power Research Institute.

 

5G OPERATIONS

 

Depending on the model, Commander can also serve as a Plug & Play device for operations over 4G or 5G networks. In fact, InDro was invited by US carrier T-Mobile to a 2022 event in Washington State. There, we demonstrated the live, remote tele-operation of a Sentinel inspection robot.

Using a simple Xbox controller plugged into a laptop at T-Mobile HQ in Bellevue, WA, we operated a Sentinel in Ottawa – more than 4,000 kilometres away. There was no perceptible lag, and even untrained operators were able to easily control the robot and cycle between the Pan-Tilt-Zoom camera, a thermal sensor, and a wide-angle camera used for situational awareness. Data from all sensors was displayed on the dashboard.
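
The software pattern behind that kind of teleoperation is refreshingly simple: a node maps gamepad input to velocity commands, which the remote robot consumes. Here’s a minimal sketch – the axis indices and scale factors are assumptions, and a production stack adds deadman switches and latency safeguards on top:

# Minimal sketch of gamepad teleoperation in ROS 2. Axis mappings and
# scales are assumptions; production stacks add safety interlocks.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Joy


class GamepadTeleop(Node):
    def __init__(self):
        super().__init__('gamepad_teleop')
        self.create_subscription(Joy, '/joy', self.on_joy, 10)
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_joy(self, msg: Joy):
        cmd = Twist()
        cmd.linear.x = 1.0 * msg.axes[1]   # left stick up/down -> speed
        cmd.angular.z = 1.5 * msg.axes[0]  # left stick left/right -> turn
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(GamepadTeleop())


if __name__ == '__main__':
    main()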

Below: T-Mobile’s John Saw, Executive Vice President, Advanced & Emerging Technologies, talks about InDro Commander-enabled robots teleoperating over 5G networks 

 

FUTURE-PROOF

 

Platforms change. Needs evolve. New sensors hit the market.

With Commander on board, developers don’t need to start from scratch. The modular design enables end-users to seamlessly upgrade platforms down the road by simply unbolting Commander and affixing it to the new set of wheels (or treads).

Below: Any sensor, including LiDAR, can be quickly integrated with InDro Commander

INDRO’S TAKE

 

You likely know the saying: “Necessity is the mother of invention.”

InDro developed this product because we could see its utility – both for our own R&D, and for clients. We’ve put Commander to use on multiple custom InDro robots, with many more to come. (We have even created a version of this for Enterprise drones.)

“On the commercial side, our clients have really benefited from the inherent modularity that the Commander provides,” says Luke Corbeth.

“Since the ‘brains’ are separate from the ‘body,’ this simplifies their ability to make the inevitable repairs or upgrades they’ll require. These clients generally care about having a high functioning robot reliably completing a repetitive task, and Commander allows us to operate and program our robots to do this.”

It can also save developers money.

“On the R&D side, the customizable nature of the Commander means they only purchase what they don’t already have,” adds Corbeth.

“For instance, many clients are fortunate enough to have some hardware already available to them – whether it’s a special camera, LiDAR or a Jetson – so we can support the integration of their existing systems remotely, or they can send this hardware directly to us. This cuts down lead times and helps us work within our clients’ budgets as we build towards the dream robot for their project.”

Still have questions or want to learn more? You can get in touch with Luke Corbeth here.

uPenn robotics team cleans up at SICK LiDAR competition

By Scott Simmie

 

There’s nothing we like more than success stories – especially when technology is involved.

So we’re pleased to share news that a team of bright young engineers from the University of Pennsylvania won a prestigious competition sponsored by SICK, the German-based manufacturer of LiDAR sensors and industrial process automation technology.

The competition, called the SICK TiM $10K Challenge, involves finding innovative new uses for the company’s TiM-P 2D LiDAR sensor. Laser-based LiDAR sensors scan the surrounding environment in real time, producing highly accurate point clouds/maps. Paired with machine vision and AI, LiDAR can be used to detect objects – and even avoid them.
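
At its simplest, that detection logic is only a few lines of ROS code: scan each 2D sweep for returns inside a stop radius. The sketch below is illustrative – the topic name and threshold are assumptions, not the SICK driver’s defaults or any team’s actual code:

# Hedged sketch of 2D-LiDAR obstacle detection. Topic name and stop
# radius are illustrative assumptions.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

STOP_RADIUS = 0.6  # metres; react if anything is closer than this


class ObstacleGuard(Node):
    def __init__(self):
        super().__init__('obstacle_guard')
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        # Keep only valid returns inside the stop radius (NaN/inf drop out).
        close = [r for r in msg.ranges if msg.range_min < r < STOP_RADIUS]
        if close:
            self.get_logger().warn(
                f'{len(close)} returns inside {STOP_RADIUS} m - stopping')


def main():
    rclpy.init()
    rclpy.spin(ObstacleGuard())


if __name__ == '__main__':
    main()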

And that’s a pretty handy feature if your robot happens to be an autonomous garbage collector. We asked Sharon Shaji, one of five UPenn team members (all of whom earned their Master’s in Robotics this year), for the micro-elevator pitch:

“It’s an autonomous waste collection robot that can be used specifically for cleaning outdoor spaces,” she says.

And though autonomous, it obviously didn’t build itself.

Below: Members of the team during work on the project.

THE COMPETITION

 

When SICK announced the contest, it set a very simple criterion: “The teams will be challenged to solve a problem, create a solution, and bring a new application that utilizes the SICK scanner in any industry.”

SICK received applications from universities across the United States. It then whittled those down to 20 submissions it felt had real potential, and supplied those teams with the TiM-P 270 LiDAR sensor free of charge.

Five students affiliated with UPenn’s prestigious General Robotics, Automation, Sensing and Perception Laboratory, or GRASP Lab, put in a team application. It was one of three GRASP Lab teams that would receive sensors from SICK.

That Lab is described here as “an interdisciplinary academic and research center within the School of Engineering and Applied Sciences at the University of Pennsylvania. Founded in 1979, the GRASP Lab is a premier robotics incubator that fosters collaboration between students, research staff and faculty focusing on fundamental research in vision, perception, control systems, automation, and machine learning.”

Before we get to building the robot, how do you go about building a team? Do you just put smart people together – or is there a strategy? In this case, there was.

“One thing we all kept in mind when we were looking for teammates was that we wanted someone from every field of engineering,” explains Shaji. In other words, a multidisciplinary team.

“So we have people from the mechanical engineering background, electrical engineering background, computer science background, software background. We were easily able to delegate work to every person. I think that was important in the success of the product. And we all knew each other, so it was like working with best friends.”

 

GENESIS

 

And how did the idea come about?

Well, says the team (all five of whom hopped on a video call with InDro Robotics), they noticed a problem in need of a solution. Quite frequently on campus – and particularly after events – they’d noticed that the green space was littered. Cans, bottles, wrappers – you name it.

They also noticed that crews would be dispatched to clean everything up. And while that did get the job done, it wasn’t perhaps the most efficient way of tackling the problem. Nor was it glamorous work. It was arguably a dirty and dull job – one of the perfect types of tasks for a robot to take on.

“Large groups of people were coming in and manually picking up this litter,” says Shaji.

“And we realised that automation was the right way to solve that problem. It’s unhygienic, there are sanitation concerns, and it’s physically exhausting. Robots don’t get tired, they don’t get exhausted…we thought this was the best use-case to move forward with.”

Below: Working on the mechanical side of things

GETTING STARTED

 

You’d think, with engineers, the first step in this project would have been to kick around design concepts. But the team focussed initially on market research. Were there similar products out there already? Would there be a demand for such a device? How frequently were crews dispatched for these cleanups? How long, on average, does it take humans to carry out the task? How many people are generally involved? Those kinds of questions.

After that process, they began discussing the nuts and bolts. One of the big questions here was: How should the device go about collecting garbage? Specifically, how should it get the garbage off the ground?

“Cleaning outdoor spaces can vary, because outdoor spaces can vary,” says team member Aadith Kumar. “You might have sandy terrain, you might have open parks, you might have uneven terrain. And each of these pose their own problems. Having a vacuum system on a beach area isn’t going to work because you’re going to collect a lot of sand. The vision is to have a modular mechanism.”

A modular design means flexibility: Different pickup mechanisms would be swappable for specific environments without requiring an entirely new robot. A vacuum system might work well in one setting, a system with the ability to individually pick items of trash might work better somewhere else.
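
In software terms, that modularity boils down to a common interface with swappable implementations. Here’s an illustrative Python sketch of the idea – the class and method names are ours, not the team’s codebase:

# Illustrative sketch of a swappable pickup-mechanism interface.
# Names are invented; this is not the SauberBOT codebase.
from abc import ABC, abstractmethod


class PickupMechanism(ABC):
    """Contract every swappable pickup module must satisfy."""

    @abstractmethod
    def engage(self) -> None: ...

    @abstractmethod
    def disengage(self) -> None: ...


class BrushSweeper(PickupMechanism):
    """Rotating brush suited to open park grass."""

    def engage(self) -> None:
        print('brush: spinning up sweeper drum')

    def disengage(self) -> None:
        print('brush: stopping sweeper drum')


class VacuumHead(PickupMechanism):
    """Suction head better suited to hard, sand-free surfaces."""

    def engage(self) -> None:
        print('vacuum: opening intake')

    def disengage(self) -> None:
        print('vacuum: closing intake')


def run_cleaning_pass(mechanism: PickupMechanism):
    mechanism.engage()
    # ... drive the coverage path here ...
    mechanism.disengage()


run_cleaning_pass(BrushSweeper())  # swap in VacuumHead() for other terrain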

The team decided their initial prototype should focus on open park space. And once that decision was made, it became clear that a brush mechanism, which would sweep the garbage from the grass into a collection box, would be the best solution for this initial iteration.

“We considered vacuum, we considered picking it up, we considered targeted suction,” says Kumar. “But at the end of the day, for economics, it needed to be efficient, fast, nothing too complicated. And the brush mechanism is tried and tested.”

Below: Work on the brush mechanism

 

 

SAUBERBOT

 

The team decided to call its robot the SauberBOT. “Sauber” is the German word for “clean”. But that sweeping brush mechanism would be just one part of the puzzle. Other areas to be tackled included:

  • Depth perception camera for identifying trash to be picked up
  • LiDAR programmed so that obstacles, including people, could be avoided
  • Autonomy within a geofenced location – ie, the boundaries of the park to be cleaned (see the sketch below)
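
The geofence item above is, at its core, a classic point-in-polygon test. Here’s a minimal ray-casting sketch with invented park coordinates – the team’s actual implementation may differ:

# Minimal ray-casting point-in-polygon geofence check. Coordinates are
# invented for illustration.

def inside_geofence(point, polygon):
    """Return True if (x, y) lies inside the closed polygon [(x, y), ...]."""
    x, y = point
    inside = False
    for i in range(len(polygon)):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % len(polygon)]
        # Does a ray cast to the right from (x, y) cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


park = [(0, 0), (40, 0), (40, 25), (0, 25)]  # hypothetical park corners (m)
print(inside_geofence((12.0, 10.0), park))   # True: keep cleaning
print(inside_geofence((45.0, 10.0), park))   # False: outside the fence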

There was more, of course, but one of the most important pieces of the puzzle was the robotic platform itself: The means of locomotion. And that’s where InDro Robotics comes in.

 

THE INDRO CONNECTION

 

Some team members had met InDro Account Executive Luke Corbeth earlier in the year, at the IEEE International Conference on Robotics and Automation, held in Philadelphia in 2022. Corbeth had some robotic platforms from AgileX – which InDro distributes in North America – at the show. At the time the conference took place, the SICK competition wasn’t yet underway. But the students remembered Corbeth – and vice versa.

Once the team formed and entered the contest, discussions with InDro began around potential platforms.

The team was initially leaning toward the AgileX Bunker – a really tough platform that operates with treads, much like a tank. At first glance, those treads seemed like the ideal form of locomotion because they can operate on many different surfaces.

But Luke steered them in a different direction, toward the (less-expensive) Scout 2.0.

“He was the one who suggested the Scout 2.0,” says team member Rithwik Udayagiri.

“We actually were thinking of going for the Bunker – but he understood that for our use-case the Scout 2.0 was a better robot. And it was very easy to work with the Scout.”

Corbeth also passed along the metal box that houses the InDro Commander. This enabled the team to save more time (and potential hassle) by housing all of their internal components in an IP-rated enclosure.

“I wanted to help them protect their hardware in an outdoor environment,” he says. “They had a tight budget, and UPenn is a pretty prominent robotics program in the US.”

But buying from InDro raises the question: Why not build their own? A team of five roboticists would surely be able to design and build something like that, right? Well, yes. But they knew they would have plenty of work as it was, without building a platform from scratch. Taking that on would have diverted them from their core R&D tasks.

“We knew we could do it in a month or two,” says Udayagiri. “But that would have left us with less time for market research and actually integrating our product, which is the pickup mechanism. We would have been spending too much time on building a platform. So that’s why we went with a standalone platform.”

It took a little longer than planned to get the recently released Scout 2.0 in the hands of the UPenn team. But because of communication with Luke (along with the InDro-supplied use of the Gazebo robot simulation platform), the team was able to quickly integrate the rest of the system with Scout 2.0 soon after it arrived.

“The entire project was ROS-based (Robot Operating System software), and they used our simulation tools, mainly Gazebo, to start working on autonomy,” explains Corbeth. “Even though it took time to get them the unit, they were ready to integrate their tech and get it out in the field very quickly. The one thing that blew me away was how quickly they put it together.”

It wasn’t long before SauberBOT was a reality. The team produced a video for its final submission to SICK. The SauberBOT team took first place, winning $10,000 plus an upcoming trip to Germany, where they’ll visit SICK headquarters.

Oh, and SauberBOT? The team says it cleans three times faster than a typical human crew.

Here’s the video.

 

A CO-BOT, NOT A ROBOT

 

Team SauberBOT knows some people are wary of robots. Some believe they will simply replace human positions and put people out of work.

That’s not the view of these engineers. They see SauberBOT – and other machines like it – as a way of helping to relieve people from boring, physically demanding and even dangerous tasks. They also point out that there’s a labour shortage, particularly in this sector.

“The cleaning industry is understaffed,” reads a note sent by the team. “We choose to introduce automation to the repetitive and mundane aspects of the cleaning industry in an attempt to do the tasks that there aren’t enough humans to do.”
 
 
And what about potential job losses?
 
 
“We intend to make robots that aren’t aimed to replace humans,” they write.
 
 
“We want to equip the cleaning staff with the tools to handle the mundane part of cleaning outdoor spaces and therefore allow the workforce to target their attention to the more nuanced parts of cleaning which demand human attention.”
 
In other words, think of SauberBOT as a co-operative robot meant to assist but not replace humans. These are sometimes called “co-bots.” 
 
 
Below: Testing out the SauberBOT in the field

INDRO’S TAKE

 

We’re obviously pleased to have played a small role in the success of the UPenn team. And while we often service very large clients – including building products on contract for some global tech giants – there’s a unique satisfaction that comes from this kind of relationship.

“It’s very gratifying,” says Corbeth. “In fact, it’s the essence of what I try to do: Enable others to build really cool robots.”

The SauberBOT is indeed pretty cool. And InDro will be keeping an eye on what these young engineers do next.

“The engineering grads of today are tomorrow’s startup CEOs and CTOs,” says InDro Robotics Founder/CEO Philip Reece.

“We love seeing this kind of entrepreneurial spirit, where great ideas and skills lead to the development of new products and processes. In a way, it’s similar to what InDro does on a larger scale. Well done, Team SauberBOT – there’s plenty of potential here for a product down the road.”

If you’ve got a project that could use a robotic platform – or any other engineering challenge that taps into InDro’s expertise with ground robots, drones and remote teleoperations – feel free to get in touch with Luke Corbeth here.

Area X.O unveils new simulation portal

By Scott Simmie

 

Area X.O, the Ottawa facility founded and operated by Invest Ottawa that houses cutting-edge companies involved in robotics and smart mobility R&D, has unveiled a powerful new tool.

It’s a simulation portal that will allow firms to virtually test products under development. Want to put a robot through its paces on the roads at Area X.O to evaluate its propulsion system and battery life? Have a drone overfly and capture data? Maybe you want to test in snow and cold temperatures, despite it being summertime?

Unless you happen to be an Area X.O tenant, carrying out any of these tasks in real life would involve getting permission, getting your product to the site – even waiting for months and taking multiple trips if you wanted to test under a variety of weather conditions. The costs on this would quickly add up, and your development time would stretch.

With the new simulator, you can put your robot or drone (or sensor) through its paces remotely – whether you’re in Ottawa, Vancouver, or even further afield. And you can use the data gathered in the simulator to improve and refine your real-world product.

“Until recently, Area X.O was limited to the physical world,” said Patrick Kenny, Senior Director of Marketing and Communications for Invest Ottawa, Area X.O and Bayview Yards.

“This past winter, Area X.O launched a simulation discovery portal powered by Ansys. The simulation portal and program promotes simulation and its ability to reduce time, cost, effort and risk by getting breakthrough innovations to market faster. Innovators now have a new option to consider.”

Kenny made his remarks during a June 7 webinar. During that event, Area X.O engineers Barry Stoute and Hossain Samei explained how the system works – and even carried out a real-time demonstration.

 


POWERED BY ANSYS

 

The brains behind the system come from Ansys, which has been in the simulation software business for more than 50 years. It is widely considered to be the most powerful software of its kind.

“Simulation is an artificial representation of a physical model,” explained simulation engineer Dr. Stoute. He went on to explain, at a high level, two different types of simulation: Finite Element Analysis (FEA) and Digital Mission Engineering.

In a nutshell, FEA uses software (and really good computers) to see how different models behave under different conditions. The model can be anything: A robot, an antenna, a drone – you name it.

“Finite Element Analysis solves for mechanical structures, thermal analysis, electronics and optical (components),” explained Dr. Stoute. Want to know what temperature a component might heat to under load? Determine how a transmitter or antenna might behave in differing temperatures? Even “see” what an optical sensor might capture when mounted on a robot? Plug in the right parameters and powerful computing will give the answer.
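
For the simplest of those questions – how hot a component gets under load – the back-of-envelope version is a single multiplication using a lumped thermal-resistance figure. Real FEA solves the full spatial problem; the numbers in this sketch are invented:

# Lumped-parameter thermal estimate: T = T_ambient + P * R_theta.
# All figures are invented for illustration; FEA resolves this spatially.

ambient_c = 25.0   # ambient temperature (deg C)
power_w = 15.0     # component power dissipation under load (W)
r_theta = 2.5      # thermal resistance to ambient (deg C per W)

steady_state_c = ambient_c + power_w * r_theta
print(f'Estimated steady-state temperature: {steady_state_c:.1f} deg C')  # 62.5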

 

DIGITAL MISSION ENGINEERING

 

This type of simulation is a way of designing a complex system, particularly where multiple assets interact with one another in a simulated environment. In the example seen below, Dr. Stoute says a digital mission engineer could create a model where a drone capturing data interacts with multiple objects. These include satellite communications, a ground station and multiple vehicles. The drone’s mission is to capture data from the ground, but the engineer is interested in seeing the Big Picture – the ways in which all these different assets will interact.

The mission engineer can select and modify the parameters of every asset in that model. How powerful is the ground station and what range will it provide? What speed is the aircraft flying at, and at what altitude? What type of aircraft is it? What sensors are on the drone and what are their specifications? What is the battery life? What are the specifications of the drone’s motors? The ambient temperature and wind conditions?

The options are dizzying. But the software – along with a well-trained mission engineer – can create a virtual world where the data outcomes closely predict what would happen in a real-world mission.

“If an engineer creates a physical product and it doesn’t work as planned, they have to go back and remodel it,” explained Dr. Stoute. The simulation environment, by contrast, allows the engineer to tweak that product in a virtual environment without the expense of real-world modifications. Once the product is working well in simulation, those learnings can be applied to the actual physical product.

Plus, of course, weather parameters can easily be changed – something impossible in real-world testing (unless you’ve got lots of time on your hands).

“Should he wait until January to get a blizzard to test the product?” asked Dr. Stoute.

“No, it doesn’t make sense. The simulator can simulate blizzard conditions.”

 

Below: Dr. Stoute explains how Digital Mission Engineering works during the webinar

 

REAL-TIME DEMONSTRATION

 

Now that the basics were explained, the webinar moved on to demonstrate these concepts. Area X.O engineer Hossain Samei took over the controls, doing a real-time demo of the sim’s capabilities.

For this, Samei used not only the Ansys core system, but another powerful piece of software called Ansys AVxcelerate, which is used to test and validate sensors for self-driving cars. That means you can plug virtual sensors, including all of their technical parameters, into the system. And not simply the sensors on the cars. In this simulation, which features a very high-resolution 3D map of the Area X.O complex, Samei also had the sensors on the Area X.O site embedded into this virtual world.

“This digital twin also includes the infrastructure embedded into our smart city zone,” explained Samei. “This includes multiple sensors, optical cameras, roadside units, thermal cameras and LiDAR cameras.” The model even includes functioning railroad crossing gates.

“We’re able to simulate the arms moving up and down,” he said.

And remember how the Ansys system can simulate weather? The mission engineer can also tailor lighting conditions – very useful for testing visual sensors.

 

VIRTUAL TEST DRIVE

 

Samei already had the digital twin of Area X.O defined. He then quickly put together an autonomous vehicle and camera sensor using AVxcelerate.

“Once we have our car defined, as well as the sensors on the vehicle, we’re able to move on to choosing a car simulator,” said Samei.

In order to help the car drive on Area X.O’s terrain, Samei turned to the open-source Webots robot simulator.

“With Webots, you can define your vehicle, including its suspension, power train and other features to define the vehicle dynamics of the car,” said Samei.
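
For readers curious what that looks like in code, Webots vehicle controllers can be written in Python via its automobile add-on. Here’s a minimal, hedged sketch – the speeds and angles are invented, and this is not the code used in the Area X.O demo:

# Minimal Webots vehicle controller sketch using the automobile add-on.
# Values are invented; this is not the Area X.O demo code.
from vehicle import Driver

driver = Driver()
driver.setCruisingSpeed(20.0)   # km/h; cruise control handles the throttle
driver.setSteeringAngle(0.0)    # radians; 0 = straight ahead

while driver.step() != -1:
    # A real controller would read sensors here and adjust speed and
    # steering; this one just drives straight.
    pass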

And now? It was time for a drive.

Samei began to pilot the car around Area X.O – showing as well that he could change the setting from a clear and dry day to one with snow on the ground with just a few clicks. As the car drove down the road, you could see some of the Smart City sensors that are physically (and virtually) embedded in the Area X.O environment.

“You can see as we pull up, all of the sensors in the environment are visible. That kind of demonstrates what we’re able to do with this model,” he said.

 

VIRTUAL DRONE FLIGHT

 

Samei then moved on to programming an autonomous drone flight over one of the experimental farm fields that surround the Area X.O facility. For this portion of the demo, he utilized the Ansys STK toolkit – specifically designed for Digital Mission Engineering. You’ll recall Dr. Stoute spoke of this, and its ability to simulate entire systems – including ground stations, satellite communication, etc.

Samei defined the area of the field to be scanned, then “built” the quadcopter by selecting motors, battery, propellers – even the pitch of the blades.

“We end up with a very accurate model of a drone that reflects its actual performance,” he said.

He also programmed the altitude of the drone and the density of the scan – with passes over the field 400′ apart. With that and a few more clicks (all in real-time, which was pretty impressive to watch), he sent the drone off on its mission.

The virtual drone quickly scanned the desired area and returned to base with power to spare. Samei then plotted a more tightly focussed grid – lower altitude and more overlap, with grid passes 200′ apart – for greater data density. Then he sent the quadcopter off again.

In this example, Samei was interested in whether the quadcopter could cover the scan with its existing power supply. He was also keen to learn if the ground station would be able to communicate with the drone throughout its mission. Both of these questions were answered in the affirmative without having to use a physical drone.

“We were able to verify the flight does not need more energy than the battery can provide,” he observed. “We can (also) see the minimum signal strength required – so indeed we are able to maintain consistent communication throughout the mission.”
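
A back-of-envelope version of that energy check looks something like the sketch below; the simulator, of course, models the airframe, environment and radio link in far greater fidelity. Every figure here is invented for illustration:

# Rough lawnmower-coverage energy check. All figures are invented.

FT_TO_M = 0.3048

field_w_m = 400.0           # field width to cover (m)
pass_len_m = 600.0          # length of each pass (m)
spacing_m = 200 * FT_TO_M   # grid passes 200' apart, per the tighter scan
speed_ms = 10.0             # cruise speed (m/s)
draw_w = 400.0              # average power draw (W)
battery_wh = 500.0          # usable battery energy (Wh)

passes = int(field_w_m / spacing_m) + 1
path_m = passes * pass_len_m + (passes - 1) * spacing_m  # passes + turns
flight_s = path_m / speed_ms
energy_wh = draw_w * flight_s / 3600.0

print(f'{passes} passes, {path_m / 1000:.1f} km, {flight_s / 60:.1f} min')
print(f'Needs {energy_wh:.0f} Wh of {battery_wh:.0f} Wh available')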

That was impressive enough. But get this: The simulation software can even account for potential signal interference caused by buildings. And such flights – whether it’s a drone or a Cessna or a business jet – are not limited to Area X.O. Ansys STK has a database of pretty much anywhere on the planet.

“You can simulate your missions and flights over anywhere on earth,” said Samei.

 

Below: A screen capture from Hossain Samei’s real-time demo. Here, he’s configuring the technical parameters for a simulated quadcopter’s propulsion system

WAIT, THERE’S MORE

 

The real-time demo was impressive. But it left one wondering: What kind of computer do you need to make these kinds of simulations actually work? Surely the computational power required exceeds what most of us carry around in our laptops.

And that’s true. But the good news is, the Area X.O simulator portal includes access to the precise kind of computer required.

“What we’re providing with our simulation services is access to our computers,” said Samei.

“We have the workstations necessary that have the computational power, the memory, that’s able to simulate these problems very fast. So it’s not necessary for the clients to have a supercomputer in order to run the simulations. We can take that 10-day simulation time down to 10 hours.”

 

THE VIRTUAL ADVANTAGE

 

If it wasn’t clear by now (and it surely was), the webinar wrapped with a reminder of why simulation is such a powerful and cost-effective tool for developers.

“We can do more different physics-based simulations such that you don’t have to build…expensive prototypes,” said Dr. Stoute. “People can actually imagine the wildest designs without any limitations. Having your wildest dreams imaginable.”

Engineer Hossain Samei also weighed in.

“One thing I really do believe in is: Knowledge is power,” he said.

“What simulation…lets us know (is) what’s going to happen and not suffer the consequences from actually having to make a product…and then find out: ‘Oops, I have a problem’. Simulation allows you to circumvent that and identify these issues before, where it’s easier to actually solve them.”

 

WANT TO TRY IT?

 

You can! Though the Area X.O simulation portal is ultimately a paid service, those interested in learning more can sign up for further free demos to get a better sense of what this resource is capable of delivering.

Sign up for free on this page.

If you thought you missed a cool demo, you did. But no worries, you can watch a replay of the entire webinar below:

INDRO’S TAKE

 

The Ansys platform is acknowledged as the best simulation platform going. And with the expertise of Area X.O engineers Dr. Barry Stoute and Hossain Samei, we’re confident a solution can be tailored for pretty much any product operating in any environment.

“It’s a normal part of R&D to go through various iterations of products following real-world testing,” says InDro Robotics CEO Philip Reece. “And while products ultimately need to be tested in the real world prior to deployment, high-level simulation can save time, money – and mistakes.

“Even though our R&D hub is situated right at Area X.O, we plan on tapping into this powerful tool to analyze some of our products currently on the drawing board.”

If you’re interested in learning more about this new tool, drop Area X.O a line here.

 

InDro hires Head of Strategic Innovations

By Scott Simmie

 

As a Research and Development company, InDro Robotics is – by necessity – engineering-heavy. Our staff at Area X.O in Ottawa and in British Columbia are constantly pushing the envelope when it comes to inventing and deploying new solutions in hardware, software and service provision.

As a result, much of the focus of our hiring over the past couple of years has been on expanding our engineering staff.

But with a growing number of InDro products and clients, it’s also important to identify and develop key partnerships. And on that front, we’re pleased to announce a non-engineer hire. Stacey Connors joins the InDro team as Head of Strategic Innovations.

The role is about the big picture – and a long-term vision of planning and executing InDro’s growth trajectory.

“My role is to find where we want to go, find the vertical that InDro should lean into, then determine what infrastructure we need based on our initial customer understanding and discoveries,” she says.

It’s a big job. And Stacey comes with the requisite experience.

FedEx

Connors comes to InDro after a 12-year, high-level run at FedEx, the global leader in express transportation. Beginning as an account executive, she went on to positions in Strategic Development, became a Worldwide Account Manager, and was a District Manager when she made the leap to InDro.

Much of her work with FedEx involved B2B development. She worked with a variety of different verticals, including aerospace, retail, healthcare and manufacturing. She comes with a special knack for putting pieces together.

“What I enjoyed about it was twofold,” she says: “Finding the intersection between the solution that my organization had available and the need or problem that the customer’s trying to solve.”

Leap of faith

 

Connors says she truly enjoyed her work at FedEx. But when the opportunity at InDro came along, she felt ready for a new challenge that would push her beyond her comfort zone.

“I was craving something wildly different,” she says. “I hadn’t remotely thought about robotics and laughed when Peter (Peter King, Head of Robotic Solutions) first mentioned it. But it was a personal opportunity to get uncomfortable, be challenged, and work on the edge – where you have to be sharp.” 

Connors has quickly jumped in, traveling to Area X.O in her first week to meet a visiting robotics company from Europe and a delegation from NAV CANADA. While there, she quickly observed one of InDro’s key strengths.

“In my first few days it was very obvious that there’s a cohesiveness among all individuals in the organization. Everyone fully understands the business objectives we’re trying to achieve and the value that each of them bring,” she says. “When I walk into an organization and see that collective spirit, that’s the horse I’m going to bet on.”

 

Solutions

 

 

Drawing on her FedEx experience, Connors says she’s excited to start identifying companies that might benefit from InDro’s many robotic solutions – including a new inventory drone system that autonomously scans warehouse stock. But while sales may well result from her work, her role is really about the bigger strategic picture as InDro continues to grow.

“Yes, I’ll be leveraging our R&D capabilities to accelerate specific industries in their use of these technologies,” she says. “But I really see InDro as an integrator – and that’s almost how I would describe my role. We have research and development, the newest and latest and greatest. I’ll be going out and seeing who has other pieces we don’t have and bringing them all together. And that really gets me excited.”

Other expertise

 

Connors, in addition to her accomplishments at FedEx, has other expertise that will serve her well in this role. She has a Bachelor of Health Science from Wilfrid Laurier University, along with a post-graduate degree from the University of the Sunshine Coast in Australia. She’s also a certified Talent Management Practitioner, has gone through the Ivey Sales Leadership Program, has studied Emotional Intelligence at McMaster’s DeGroote School of Business, and is a Certified Multipliers Leader – the latter meaning she has expertise in bringing out the greatness in others.

But she’d rather talk about InDro – and what she’s learned since coming onboard – than about herself.

“At FedEx our operators were the core and value of the company. And it is obvious that the engineers are the core value of this company,” she says. “Research and development is that incessant hunger to continue to provide new options, new solutions, new technologies. And you can feel that spirit here.”

InDro’s take

 

The hiring of Stacey Connors as Head of Strategic Innovations is significant for a couple of reasons. The first, obviously, is that she brings proven skills, expertise, and an outstanding reputation.

But the second is really about the timing.

InDro Robotics has been growing steadily. In the last two years our team has developed and deployed multiple new products and services, and we are working with several global technology companies. Our engineering team has continued to grow.

“We are at a significant juncture in the company’s trajectory,” says CEO Philip Reece. “While InDro will always be an engineering-first firm, we are now at the stage of securing strategic partnerships to ensure the next phase of growth. Stacey is the right person, in the right position, at the right time.”