Dual manipulator Rosie the robot used for Industry 4.0 research

By Scott Simmie

 

At least some of you will remember The Jetsons.

The television series, created by Hanna-Barbera Cartoons Inc., was a space-age version of The Flintstones (another Hanna-Barbera production). It originally aired in 1962-1963 with later episodes created in a reboot from 1985 to 1987.

But while Fred Flintstone drove a stone-age car (complete with stone wheels) that he powered by pushing his feet along the ground, George Jetson and his family lived in Orbit City, where Jetson commuted to his two-hour-per-week job via a flying car with a bubble top. And instead of having dinosaurs (including pterodactyls) help carry out tasks, the Jetsons lived in a future where they were surrounded by automated devices. You could think of their surroundings as the 1960s vision of the Smart Home.

And an integral part of that home? Well, that would be Rosey (later changed to ‘Rosie’) the robot.

Rosey was the family’s robotic maid. She carried out tasks that weren’t performed by the many other automatic conveniences that filled the Jetsons’ home. She had two manipulator arms and an internally stored vacuum that could be deployed on demand.

She was very useful around the house, carrying out tasks to save the family time.

And this story? Well, it’s about our own Rosie – which is also very space-age.

Below: A Rosie the robot publicity cel, signed by show creators William Hanna and Joseph Barbera. The cel was auctioned in 2018; image by Heritage Auctions


THE ROSIE STORY

 

So. What is Rosie? We asked Head of R&D Sales Luke Corbeth for a snapshot.

“Rosie is a dual arm mobile manipulation robot designed for pick and place in an industry 4.0 setting,” he says. In other words, it has two arms and manoeuvres on a wheeled platform, and is capable of moving objects from one location to another or even manipulating a single object with both end effectors.

And Rosie has a few tricks up her sleeve. Or, more accurately, sleeves.

“The actual robot is very unique because it has six mounting points for the arms. So you can mount the arms on top, high on the side or low on the side to access shelving of different heights. In fact, you could actually mount one arm directly on the top right, for example, and then mount the second one on the bottom left. So you could grab something from the top of the shelf and from the floor at the same time, which is kind of cool, right?”

Yes, indeed.

Rosie’s home is not with the Jetsons (she has no vacuum cleaner) but in a new lab at Polytechnique Montréal that hasn’t yet been officially launched. It’s called the Intelligent-Cyber Physical System Lab, or I-CPS. So we contacted Lionel Birglen, a professor with the Department of Mechanical Engineering. We wanted to learn more about what the lab does, what he does – and what plans he has for Rosie (which InDro built and shipped in 2023).

Dr. Birglen holds a PhD in mechanical engineering, with a specialisation in robotics. He’s particularly interested in – and an expert on – manipulators and end effectors, and has designed and built both. He’s written two books, holds three patents, and is the author or contributing author of at least 94 research papers. He’s also – get this – been listed among the top two per cent most-cited scientists in the world in his area of specialisation.

So it kinda goes without saying, but he’s a pretty big deal in this field.

Dr. Birglen has a deep interest in the role robotics will play in the future of industry. And, within that realm, he’s intensely interested in ensuring that robots, particularly those that will be sharing space with human beings on a factory or warehouse floor, will be safe.

And – he emphasises – he doesn’t trust simulations for important work like this.

“Because simulations lie. They lie all the time,” he says. “You have to understand that reality is infinitely more complex than anything you can have in simulation – so actual experiments are absolutely essential to me. They are essential to my work, to my understanding of what robotic manipulation is.”

“I believe in math, but I know that reality is different. It’s more complex, more complicated, and includes so many un-modelled phenomena.”

 

ROSIE’S JOURNEY

 

Dr. Birglen knew he wanted a new robot for use in the new lab (which we’ll get to shortly). And he knew he wanted a robot with two manipulator arms.

“Dual-arm robots are, in my opinion, the future for industry applications,” he says.

And while humanoid bipeds grab a lot of attention, they’re far more complex (and expensive) than wheeled robots. Plus, he says, most factory applications take place on a single level and don’t require climbing stairs.

“From a factory perspective, a wheeled platform makes a lot of sense because typically in factories you don’t have, say, five levels connected by stairs.”

So he knew he wanted an autonomous, wheeled, dual-arm robot. Initially, though, he had a company other than InDro in mind for the build.

“I came across InDro almost by accident,” he explains. “Giovanni Beltrame told me about you because he has purchased many, many robots from you. He said: ‘Those guys can build and assemble the robot for you. They’re close and they do a great job.’ So that’s how I came in contact with you.” (We’ve written previously about the amazing work Dr. Beltrame is doing with robots and space. You can find that here.)

And so, after a number of calls with Luke Corbeth and the engineering team to settle on design and performance parameters, work on Rosie began.

Below: Technologist Tirth Gajera (‘T’) puts the finishing touches on Rosie in 2023


THE LAB

 

Polytechnique Montréal’s Intelligent-Cyber Physical System Lab (I-CPS) is set up as a highly connected Industry 4.0 factory. Faculty from four different departments – computer engineering, electrical engineering, industrial engineering and mechanical engineering (Dr. Birglen) – are involved with the lab. Interns and students, under supervision, also work in the facility.

“So we have four departments involved in this lab and the idea is to build a small scale factory of the future, meaning that everything is connected. We are building a mini-factory inside this lab,” he says.

So think of cameras that can track objects on shelves – and people and robots within the environment. Think of smart tools like a CNC machine, which will eventually be operated by Rosie. And perhaps just as important as the connectivity within the lab is the connectivity to other research institutes in Quebec, including Université Laval, Université de Sherbrooke and École de technologie supérieure (ÉTS). All of those institutes are working with similar mini-factories, and they’re all connected. There’s even a relationship (and connectivity) with manipulator manufacturer Kinova. Funding came via a significant grant from the Canada Foundation for Innovation, or CFI.

“So think of our lab as like one node of this network of mini-factories around Quebec,” explains Dr. Birglen. That connectivity of all components is still a work-in-progress, but “ultimately the goal is that there is a cyber-connection between these different mini-factories, these different laboratories around Quebec, so that one part of one node can work in collaboration with another node in realtime.”

Plus, of course, a lot of learnings will take place within the individual labs themselves.

“We want to bring collaborative robots to work in tandem with humans,” he says. “We want our robots to safely move around people, we want robots to help people. And we also want robots to learn how to work from people.”

 

SAFETY, SAFETY, SAFETY

 

As mentioned earlier, there’s a huge emphasis on safety. And while there are international safety standards for collaborative robots, even a ‘safe’ cobot can pose a threat.

“All the collaborative robots that you have currently on the market more or less follow this technical standard and they are more or less safe, but they’re still dangerous,” explains Dr. Birglen. “And the classical example that we’ve all heard, and which is true, is that if a safe cobot has a knife in its hand and is moving around – it is very dangerous.”

So safety in the lab(s) is paramount – and that means safety at multiple levels. There must be safety:

  • At the task level – tasks must not endanger people
  • At the control level
  • In terms of collision detection, mitigation and obstacle avoidance
  • At the data security level

Plus – and this really interests Dr. Birglen – you must ensure safety with any additional mechanical innovations that are introduced.

“What you develop, any mechanical system you develop, must be as much as possible intrinsically safe. And actually, one of the topics I’m currently working on is developing end effectors and tooling that are intrinsically safe.”

Below: A LinkedIn post from Luke Corbeth shows Rosie, using both arms, inside the I-CPS lab

THE FUTURE

 

And why is research like this so important? What difference will it make to have robots and humans working safely together, with safe manipulators and end effectors that might even be able to, for example, lift an object in concert with a human being? And why the focus on interconnectedness between all of these facilities?

Well, there’s obviously the value of the research itself – which will lead to greater efficiencies, improved manipulators, gripping technologies, new algorithms and AI enhancements – as well as enhanced safety down the road. But there’s a much bigger picture, says Dr. Birglen, especially if you can get your head around thinking about the future from a global perspective.

China, he says, is no longer a developing nation. The days when the words “Made in China” meant poor quality are – with rare exceptions – gone. The country is, in fact, highly developed – and working at breakneck speed when it comes to innovation and adoption of robotics at scale. A revolution is underway that has massive implications for competitive advantage that simply cannot be ignored. So the research at I-CPS is not merely important from an academic perspective; it’s strategic when viewed through a global economic lens.

“We as a country – meaning Canada – are in competition with other countries for manufacturing, for producing goods and services. China is a developed country and it is very, very, very good in robotics,” he states. “You know how in the past we saw China as producing low quality goods, low quality robots? That’s over, man. That’s finished.”

And?

“If they are investing in robotics like mad and we are not, we’re going to be a leftover – Canada is going to sink as a rich country. If you want to produce wealth in the 21st Century, you need robots, you need automation, you need integration. In short, you need to be the leader of the pack or you’re going to be eaten.”

It’s a stark warning – and it’s true.

I’ll step outside my role as author here: I say this having lived in China back when it was still a developing country in the late 1980s – and having returned several times since. The transformation has been nothing short of astonishing. How, you might ask, did it achieve all this?

The answer has its genesis with former Chinese leader Deng Xiaoping, who led the country from 1978 to 1989. He didn’t merely open the door to reform; he created policies that began sending waves of students abroad to study, from what had been a largely closed, xenophobic country. There was an emphasis on careers that could help modernise the nation, including all aspects of engineering, aerospace, construction, transportation, architecture, etc. That’s where all this began.

Thankfully (and with credit to federal funding agencies like CFI), there are projects like I-CPS underway – and academics like Dr. Lionel Birglen with the vision to push the needle safely forward.

Below: “Baxter” – an early dual-arm robot. Baxter is still at Polytechnique Montréal, but Rosie is the mobile future. Photo by Luke Corbeth


INDRO’S TAKE

 

We’re obviously pleased Polytechnique Montréal selected InDro to build Rosie. And we’re particularly pleased to see that she’s being deployed at I-CPS, as part of an integrated and networked research project that has such potentially profound implications for the future.

“I believe Dr. Birglen is correct in his assessment of the importance of robotics and automation in the future,” says InDro Robotics Founder and CEO Philip Reece. “And when you throw innovations with drones and even autonomous Uncrewed Aerial Vehicles capable of carrying large cargo loads and passengers into the mix, we are actually heading into a Jetsons-like future,” he adds.

“I think there’s a growing understanding of the implications of this kind of future from not only the private sector, but also federal regulators and funding agencies. At InDro our mission will always focus on continued innovation. Sometimes those innovations are our own inventions, but a key piece of the puzzle is R&D work carried out by academics like Lionel Birglen. We’re confident that Rosie’s arms are in the right hands.”

Interested in learning more about a custom robotics solution? Feel free to contact us here.

InDro Robotics ROS-based drone an R&D powerhouse

By Scott Simmie

 

InDro Robotics is pleased to unveil details of its highly capable new R&D drone.

Running the Robot Operating System (ROS) and with powerful onboard compute capabilities, the drone is perfect for advanced Research and Development.

“It’s a drone geared toward R&D first and foremost,” explains Luke Corbeth, Head of R&D Sales. “It truly is a flying robot – and you can program and use it in a very similar fashion to all our other robots.”

There’s a real demand in the research world for open-source drones that can be programmed and run highly complex algorithms. These kinds of drones can be used to study swarm behaviour, object detection and identification, mapping in GPS-denied locations and much more.

For some researchers, the budget go-to has been the Crazyflie, a 27-gram micro-drone built around a small onboard microcontroller. Its advantage is that it’s quite affordable. But its low cost, tiny size and relatively low computing power mean it has limitations – including very little capacity to add sensors of any weight.

“This drone can do so much more,” says Corbeth. “With the NVIDIA Xavier NX onboard for compute, it can effectively map entire environments. And when it comes to landing and object recognition, it’s truly phenomenal. It can even land on a moving vehicle.”

Below: A look at InDro’s new drone, which comes complete with LiDAR, a depth-perception camera, 5G connectivity – and much more.


THE BACK STORY

 

If you’ve been following the latest news from InDro, you’ll be aware we have an incubation agreement with Cypher Robotics. That company builds solutions for cycle counting and precision scanning in the industrial/supply chain space. InDro assisted with the development of its signature product, Captis.

Captis integrates an autonomous ground robot with a tethered drone. As the Captis robot autonomously navigates even narrow stock aisles, the drone ascends on a tether attached to that ground robot. The drone then scans the barcodes (it’s code-agnostic) of the products on the shelves. All of that data is transferred seamlessly, in real-time, to the client’s Warehouse Management System (WMS), Warehouse Control System (WCS) and Warehouse Execution System (WES) software.
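To make that data flow concrete, here’s a purely illustrative sketch – not Cypher’s actual integration – of what relaying a single scan event to a warehouse system’s REST endpoint might look like. The endpoint URL, payload fields and token are all hypothetical.

```python
import json
import time
import urllib.request

# Purely hypothetical example: relay one barcode scan event to a WMS.
# The URL, payload fields and auth token are illustrative, not Captis' real API.
WMS_ENDPOINT = "https://wms.example.com/api/v1/scan-events"  # hypothetical
API_TOKEN = "replace-me"                                     # hypothetical

def report_scan(barcode: str, aisle: str, shelf_level: int) -> int:
    """POST one scan event and return the HTTP status code."""
    event = {
        "barcode": barcode,
        "aisle": aisle,
        "shelf_level": shelf_level,
        "timestamp": time.time(),
    }
    req = urllib.request.Request(
        WMS_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# e.g. report_scan("0123456789012", aisle="A7", shelf_level=3)
```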

The capabilities of Captis led to a partnership with global AI fulfilment experts GreyOrange and leading global telco innovator Ericsson. The product debuted at the recent MODEX2024 conference (one of the biggies in the automated supply chain world), where it gained a *lot* of attention.

While working on the project, it was always clear the drone – thanks to multiple modifications – would be highly suitable as a research and development tool. It’s capable of machine vision/object recognition, machine learning, and can find its way around in completely unfamiliar, GPS-denied environments.

“In fact, I have one client that’s using it for research in mines,” says Corbeth.

 

THE JETSON DIFFERENCE

 

NVIDIA has made quite a name for itself – and quite a profit for its shareholders – with its powerful AI-capable processors. The Jetson Xavier NX features a 6-core NVIDIA Carmel Arm®v8.2 64-bit processor running at speeds of up to 1.9 GHz. Its graphics processing unit features a 384-core NVIDIA Volta™ architecture with 48 Tensor Cores. Put it all together, and the computing power is astonishing: The Xavier NX is rated with a maximum achievable output of 21 TOPS – trillion operations per second. (We were going to try to count, but thought it more efficient to rely on NVIDIA’s specs for this.)

The LiDAR unit currently shipping with the drone also has some flex. It’s the Ouster 32-channel OS1 (Rev6.2). With a maximum range of 200 metres (90 metres on a dark, 10 per cent reflectivity target), its powerful L3 chip is capable of processing scans of up to 5.2 million points per second in the OS1 family’s highest-resolution, 128-channel configuration (again, we didn’t count). Hostile environment? No problem. The LiDAR can operate from -40°C to 60°C and has an IP68 Ingress Protection rating.

“The OS1 is designed for all-weather environments and use in industrial automation, autonomous vehicles, mapping, smart infrastructure, and robotics,” states its manufacturer. “The OS1 offers clean, dense data across its entire field of view for accurate perception and crisp detail in industrial, automotive, robotics, and mapping applications.”

The unit uses open source ROS and C++ drivers, and comes with Ouster’s Software Development Kit. Its ability to accurately sense its environment (down to distances of 0.5 metres away), combined with the NVIDIA processor and the depth camera also allows this machine to do something pretty extraordinary: It can recognise and land on a moving platform.

“That’s a very challenging problem to solve and requires not only specific sensing but also really powerful onboard compute. This drone can do it,” explains Corbeth.
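For researchers wondering what those open-source drivers buy you: once the Ouster ROS 2 driver is running, a few lines of Python are enough to start consuming scans. A minimal sketch, assuming the driver’s common default topic /ouster/points (topic names vary by configuration):

```python
import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import PointCloud2

# Minimal ROS 2 node that listens to the LiDAR point cloud published by
# the Ouster driver. The topic name is an assumption; check your launch config.
class CloudListener(Node):
    def __init__(self):
        super().__init__("cloud_listener")
        self.create_subscription(
            PointCloud2, "/ouster/points", self.on_cloud,
            qos_profile_sensor_data)

    def on_cloud(self, msg: PointCloud2):
        # width * height = number of points in this scan
        self.get_logger().info(f"Received {msg.width * msg.height} points")

def main():
    rclpy.init()
    rclpy.spin(CloudListener())

if __name__ == "__main__":
    main()
```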

Already, word about the product has been spreading. A number of units have already been sold to academic institutions for research purposes – and the team has been hard at work building and testing for the next set of orders (as seen below).

THE FORGE CONNECTION

 

Like all new products, the new drone required custom parts. We looked no further than InDro Forge, our rapid prototyping and limited production run facility in Ottawa.

Using state-of-the-art additive and subtractive tools, the Forge team created custom mounts using carbon fibre and other strong but lightweight materials, while also ensuring the frame was robust enough to take on even the most challenging environments where these drones will be deployed.

“InDro Forge has been critical to the finished product,” says Corbeth. “We wanted a look, feel and quality that matches this drone’s capabilities – and InDro Forge delivered.”


INDRO’S TAKE

 

We’re obviously excited about the capabilities of this new drone, and we’re not alone. Interest in this product from researchers has already been significant. In fact, we’re not aware of any other drone on the market offering this combination of specific capabilities.

It was that void – in concert with our partnership with Cypher Robotics – that led to its creation.

“InDro has always placed a great emphasis on the development of innovative new products,” says CEO Philip Reece. “We build new products at the request of clients and also develop our own when we see a market opportunity. In this case, the requirements for Cypher Robotics dovetailed nicely with demand for such a drone from researchers.”

Production of the new drone is moving at a swift pace. If you’re interested in a briefing or demo, you can contact us here.

QUEBEC’S HAPLY ROBOTICS MAKES THE VIRTUAL FEEL REAL

By Scott Simmie

 

Odds are you’ve heard of remote surgery by now.

That’s where a surgeon, looking at screens that provide incredibly detailed 3D video in realtime, conducts the operation using a controller for each hand. The inputs on those controllers are translated into scaled-down movement of robotic arms fitted with the appropriate medical devices. The robotic arms are capable of moving a precise fraction of the distance of the operators’ hands. As a result, these systems allow for far greater control, particularly during really fine or delicate procedures. 

The surgeon might be at a console in the operating theatre where the patient is. Or they could be operating on someone remotely. You could have a specialist in Montreal perform an operation on someone elsewhere in the world – providing you’ve got a speedy data connection.

The video below does a really good job of explaining how one of the best-known systems works. 

 

THE IMPORTANCE OF FEEL

 

Conducting standard surgery (or a variety of other tasks) without robots involves constant tactile feedback.  If a doctor is moving an instrument through tissue – or even probing inside an ear – they can feel what’s going on. Think of cutting a piece of fruit; you adjust the pressure on the knife depending on how easy the fruit is to slice. When you put a spoon into a bowl of jello, that constant feedback from the utensil helps inform how hard or soft you need to push.

This tactile feedback is very much a part of our everyday lives – whether it’s brushing your teeth or realising there’s a knot in your hair while combing it. Even when you scratch an itch, you’re making use of this feedback to determine the appropriate pressure and movements (though you have the additional data reaching your brain from the spot being scratched).

But how do you train someone to perform delicate operations like surgery – even bomb defusal – via robotics? How do you give them an accurate, tactile feel for what’s happening at the business end? How much pressure is required to snip a wire, or to stitch up a surgical opening?

That’s where a company from Quebec called Haply Robotics comes in.

“Haply Robotics builds force-feedback haptic controllers that are used to add the sense of touch to VR experiences, and to robotic control,” explains Product Manager Jessica Henry. “That means that our controller sits on the human interface side and lets the human actually use their hand to do a task that is conveyed to a robot that’s performing that task.”

We met some of the Haply Robotics team during the fall at the IROS 2023 conference in Detroit. We had an opportunity for a hands-on experience, and were impressed.

 

INVERSE3

 

That’s the name of Haply’s core product.

“The Inverse3 is the only haptic interface on the market that has been specially designed to be compact, lightweight, and completely portable,” says the company’s website. “Wireless tool tracking enables you to move freely through virtual environments, while our quick tool change mechanism allows you to easily connect and swap VR controllers, replica instruments, and other tools to leverage the Inverse3’s unmatched power and precision for next-generation force-feedback control.

“The Inverse3 replicates tactile sensory input required for simulating technical tasks. It can precisely emulate complex sensations like cutting into tissue or drilling into bone – empowering students, surgeons, and other healthcare professionals to hone and perfect medical interventions before ever performing them in the clinical environment.”

Haply Robotics has produced an excellent video that gives you both a look at the product – and how it works:

 

WHAT DOES IT FEEL LIKE?

 

While at IROS, we had a chance to put our hands on the Inverse3.

In one of the simulations (which you’ll see shortly), the objective was to push a small sphere through a virtual gelatin-like substance. As you start pushing the ball against that barrier, you begin to feel resistance through the handle of the Inverse3. Using force-feedback, you continue to push and feel that resistance increase. Finally, when you’ve hit precisely the correct amount of pressure, the ball passes through the gelatin. The sensation, which included a satisfying, almost liquid ‘pop’ as the ball passed through, was amazing. It felt exactly like you would have anticipated it would feel with a real-world object.

“Touch adds more information as opposed to just having the visual information,” explains Henry. “You also have the tactile information, so you have a rich amount of information for your brain to make a decision. You can even introduce different haptic boundaries, so you can use things like AI in order to add some kind of safety measure. If the AI can say ‘don’t go there’ – it can force your hand out of the boundary with haptic cues. So it’s not just visual, it’s not just audio.”
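Haply’s SDK handles the actual device I/O, so treat the following as a conceptual sketch only: a toy version of the gelatin demo’s force law, where resistance builds like a spring until a break threshold is crossed and the force collapses – the ‘pop’. The handle position is simulated here, and the stiffness and threshold values are invented for illustration.

```python
# Conceptual sketch of the 'gelatin pop': spring-like resistance that
# collapses once the penetration force exceeds a break threshold. On real
# hardware, position would come from the haptic device API (simulated here).
STIFFNESS = 300.0      # N/m, resistance of the virtual gel (assumed value)
BREAK_FORCE = 2.5      # N, force at which the gel gives way (assumed value)

def gel_force(penetration_m: float, broken: bool) -> tuple[float, bool]:
    """Return (opposing force in N, updated broken state)."""
    if broken or penetration_m <= 0.0:
        return 0.0, broken
    force = STIFFNESS * penetration_m
    if force >= BREAK_FORCE:
        return 0.0, True     # the 'pop': resistance vanishes
    return force, False

broken = False
for step in range(12):
    depth = step * 0.001     # simulated handle pushing 1 mm deeper per step
    f, broken = gel_force(depth, broken)
    print(f"depth {depth*1000:4.1f} mm -> feedback {f:4.2f} N")
```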

 

SIMULATION, TRAINING…AND MORE

 

The Inverse3 is already in use for simulation training in the medical industry. In fact, many existing devices for robotic surgery do not have haptics – and there’s clearly a demand.

“Robotic surgical consoles don’t use haptics yet, and we’re hearing that surgeons are asking for that to be added because it’s missing that sense,” says Henry. “A mistake they can make is to push an instrument too far in because it’s just visual. If you had haptics on your handles, you would intuitively know to pull back.”

Remember how we tried pushing a virtual object through a gel-like substance? You’ll see that in this video around the :24 mark:

THE HAPLY STORY

 

Well, it’s not the entire Haply Robotics story, but here it is in a nutshell.

The idea for the product – for the need for such a product – first surfaced in 2016. The three co-founders were working on haptic devices at Canada’s National Research Council. Existing devices at the time were large and didn’t offer the greatest user experience. They saw an opportunity to create something better. The company has been in business since 2018 – with these three at the helm:

  • Colin Gallacher (MEng, MSc, President)
  • Steve Ding (MEng, Electrical lead)
  • Felix Desourdy (BEng, Mechanical lead)

The trio put their heads together and – a lot of R&D later – produced the Inverse3.

The company manufactures the physical product, which contains three motors to provide haptic feedback. Haply Robotics also makes an API, but the coding for the simulations comes from outside partners. Fundamental VR, for example, is a company devoted to developing virtual training simulations for everything from ophthalmology to endovascular procedures. It coded that gelatin simulation.

“Studies confirm that VR significantly improves the effectiveness of medical education programs. Adding real haptics increases accuracy and delivers full skills transfer,” says the Fundamental VR website. In fact, it cites research showing a 44 per cent improvement in surgical accuracy when haptics are part of the VR experience.

“In the training space, when you’re using it for simulation, a surgeon’s work is very tactile and dexterous,” says Haply’s Jessica Henry. “We enable them to train using those instruments with the proper weights, the proper forces, that they’d encounter in surgery as opposed to textbooks or cadavers. It’s a more enriched way of interacting.”

And it really, really feels real.

Below: Haply’s Jessica Henry manipulates the Inverse3

 

 


INDRO’S TAKE

 

It’s always great discovering another new company in the robotics field, particularly one with an innovative solution like the Inverse3. It’s also great when these companies are Canadian.

“Haply Robotics has identified a clear void in the marketplace and created a solution,” says InDro Robotics CEO Philip Reece. “With the growth in remote robotics – not just surgery – I can see a wide range of use-cases for the Inverse3. Congratulations to the Haply team on being ahead of the curve.”

For more info on the product, check out the Haply Robotics website.

InDro Commander module streamlines robotics R&D

By Scott Simmie

 

Building robots is hard.

Even if you start with a manufactured platform for locomotion (very common in the case of ground robots), the work ahead can be challenging and time-consuming. How many sensors will require a power supply and data routing? What EDGE processing is needed? How will a remote operator interface with the machine? What coding will allow everything to work in unison and ensure the best data and performance possible? How will data be transmitted or stored?

That’s the hard stuff, which inevitably requires a fair bit of time and effort.

It’s that hurdle – one faced by pretty much everyone in the robotics R&D world – that led to the creation of InDro Commander.

InDro Commander

WHAT INDRO COMMANDER DOES

 

InDro Commander is a platform-agnostic module that can bolt on to pretty much any means of locomotion. In the photo above, it’s the box mounted on top of the AgileX Bunker (just above the InDro logo).

Commander is, as this webpage explains, “a single box with critical software and hardware designed to simplify payload integration and enable turn-key teleoperations.” Whether you’re adding LiDAR, thermal sensors, RTK, Pan-Tilt-Zoom cameras – or pretty much any other kind of sensor – Commander takes the pain out of integration.

The module offers multiple USB inputs for sensors, allowing developers to decide on a mounting location and then simply plug them in. A powerful Jetson EDGE computer handles onboard compute functions. The complete Robot Operating System software libraries (ROS1 and ROS2) are bundled in, allowing developers to quickly access the code needed for various sensors and functions.
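For a flavour of what that plug-and-play workflow looks like on the software side, here’s a generic sketch – not InDro’s actual Commander code – that bridges a USB-serial sensor into a ROS 2 topic. The device path and baud rate are assumptions.

```python
import rclpy
import serial  # pyserial
from rclpy.node import Node
from std_msgs.msg import String

# Generic sketch (not InDro's code): bridge a USB-serial sensor into ROS 2.
# Device path and baud rate are assumptions; adjust for your hardware.
class SerialSensorBridge(Node):
    def __init__(self):
        super().__init__("serial_sensor_bridge")
        self.port = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)
        self.pub = self.create_publisher(String, "sensor/raw", 10)
        self.create_timer(0.05, self.poll)   # poll at 20 Hz

    def poll(self):
        line = self.port.readline().decode(errors="ignore").strip()
        if line:
            self.pub.publish(String(data=line))

def main():
    rclpy.init()
    rclpy.spin(SerialSensorBridge())

if __name__ == "__main__":
    main()
```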

“Our engineering team came up with the concept of the InDro Commander after integrating and customizing our own robots,” says Philip Reece, CEO of InDro Robotics. “We realized there were hurdles common to all of them – so we designed and produced a solution. Commander vastly simplifies turning a platform into a fully functioning robot.”

Account Executive Luke Corbeth takes it further:

“The Commander serves as a ‘brain-box’ for any UGV,” he says. “It safely houses the compute, connectivity, cameras, sensors and other hardware in an IP54 enclosure.”

It also comes in several options, depending on the client’s requirements.

“There are three ‘standard versions’ which are bundles to either be Compute Ready, Teleoperations Ready or Autonomy Ready,” adds Corbeth.

“I’ve realized over time that the value of Commander is our ability to customize it to include, or more importantly, not include specific components depending on the needs of the project and what the client already has available. In reality, most Commanders I sell include some, but not usually all, of what’s in the Commander Navigate. We’re also able to customize to specific needs or payloads.”

Below: Commander comes in multiple configurations

InDro Commander

COMMANDER DOES THE WORK

 

With InDro Commander, developers can spend more time on their actual project or research – and far less time on the build.

“For end-users wanting a fully customized robot, Commander saves a huge amount of time and hassle,” says InDro Engineering Lead Arron Griffiths. “Customers using this module see immediate benefits for sensor integration, and the web-based console for remote operations provides streaming, real-time data. Commander also supports wireless charging, which is a huge bonus for remote operations.”

Commander serves as the brains for several InDro ground robots, including Sentinel. This machine was recently put through its paces over 5G in a test for EPRI, the Electric Power Research Institute.

 

5G OPERATIONS

 

Depending on the model, Commander can also serve as a Plug & Play device for operations over 4G or 5G networks. In fact, InDro was invited by US carrier T-Mobile to a 2022 event in Washington State. There, we demonstrated the live, remote tele-operation of a Sentinel inspection robot.

Using a simple Xbox controller plugged into a laptop at T-Mobile HQ in Bellevue WA, we operated a Sentinel in Ottawa – more than 4,000 kilometres away. There was no perceptible lag, and even untrained operators were able to easily control remote operations and cycle between the Pan Tilt Zoom camera, a thermal sensor, and a wide-angle camera used for situational awareness by the operator. Data from all sensors was displayed on the dashboard, with the ability for the operator to easily cycle between them.
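Behind a demo like that sits a very standard ROS pattern: map gamepad axes to velocity commands. A minimal sketch, assuming a joystick driver publishing sensor_msgs/Joy and a robot base listening on cmd_vel (axis indices and speed limits vary by controller and platform):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

MAX_LINEAR = 1.0    # m/s   (assumed limit)
MAX_ANGULAR = 1.5   # rad/s (assumed limit)

# Minimal teleop sketch: left stick forward/back -> linear.x,
# left stick left/right -> angular.z. Axis indices vary by controller.
class JoyTeleop(Node):
    def __init__(self):
        super().__init__("joy_teleop")
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.create_subscription(Joy, "joy", self.on_joy, 10)

    def on_joy(self, msg: Joy):
        cmd = Twist()
        cmd.linear.x = MAX_LINEAR * msg.axes[1]    # assumed axis mapping
        cmd.angular.z = MAX_ANGULAR * msg.axes[0]
        self.pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(JoyTeleop())

if __name__ == "__main__":
    main()
```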

Below: T-Mobile’s John Saw, Executive Vice President, Advanced & Emerging Technologies, talks about InDro Commander-enabled robots teleoperating over 5G networks 

 

FUTURE-PROOF

 

Platforms change. Needs evolve. New sensors hit the market.

With Commander on board, developers don’t need to start from scratch. The modular design enables end-users to seamlessly upgrade platforms down the road by simply unbolting Commander and affixing it to the new set of wheels (or treads).

Below: Any sensor, including LiDAR, can be quickly integrated with InDro Commander

Teleoperated Robots

INDRO’S TAKE

 

You likely know the saying: “Necessity is the mother of invention.”

InDro developed this product because we could see its utility – both for our own R&D, and for clients. We’ve put Commander to use on multiple custom InDro robots, with many more to come. (We have even created a version of this for Enterprise drones.)

“On the commercial side, our clients have really benefited from the inherent modularity that the Commander provides,” says Luke Corbeth.

“Since the ‘brains’ are separate from the ‘body,’ this simplifies their ability to make the inevitable repairs or upgrades they’ll require. These clients generally care about having a high functioning robot reliably completing a repetitive task, and Commander allows us to operate and program our robots to do this.”

It can also save developers money.

“On the R&D side, the customizable nature of the Commander means they only purchase what they don’t already have,” adds Corbeth.

“For instance, many clients are fortunate enough to have some hardware already available to them whether it’s a special camera, LiDAR or a Jetson so we can support the integration of their existing systems remotely or they can send this hardware directly to us. This cuts down lead times and helps us work within our clients’ budgets as we build towards the dream robot for their project.”

Still have questions or want to learn more? You can get in touch with Luke Corbeth here.

uPenn robotics team cleans up at SICK LiDAR competition

By Scott Simmie

 

There’s nothing we like more than success stories – especially when technology is involved.

So we’re pleased to share news that a team of bright young engineers from the University of Pennsylvania won a prestigious competition sponsored by SICK, the German-based manufacturer of LiDAR sensors and industrial process automation technology.

The competition is called the SICK TiM $10K Challenge, and it involves finding innovative new uses for the company’s TiM-P 2D LiDAR sensor. Laser-based LiDAR sensors scan the surrounding environment in real-time, producing highly accurate point clouds/maps. Paired with machine vision and AI, LiDAR can be used to detect objects – and even avoid them.

And that’s a pretty handy feature if your robot happens to be an autonomous garbage collector. We asked Sharon Shaji, one of five UPenn team members (all of whom earned their Master’s in Robotics this year), for the micro-elevator pitch:

“It’s an autonomous waste collection robot that can be used specifically for cleaning outdoor spaces,” she says.

And though autonomous, it obviously didn’t build itself.

Below: Members of the team during work on the project.


THE COMPETITION

 

When SICK announced the contest, it had a very simple criterion: “The teams will be challenged to solve a problem, create a solution, and bring a new application that utilizes the SICK scanner in any industry.”

SICK received applications from universities across the United States. It then whittled those down to 20 submissions it felt had real potential, and supplied those teams with the TiM-P 270 LiDAR sensor free of charge.

Five students affiliated with UPenn’s prestigious General Robotics, Automation, Sensing and Perception Laboratory, or GRASP Lab, put in a team application. It was one of three GRASP lab teams that would receive sensors from SICK.

That Lab is described here as “an interdisciplinary academic and research center within the School of Engineering and Applied Sciences at the University of Pennsylvania. Founded in 1979, the GRASP Lab is a premier robotics incubator that fosters collaboration between students, research staff and faculty focusing on fundamental research in vision, perception, control systems, automation, and machine learning.”

Before we get to building the robot, how do you go about building a team? Do you just put smart people together – or is there a strategy? In this case, there was.

“One thing we all kept in mind when we were looking for teammates was that we wanted someone from every field of engineering,” explains Shaji. In other words, a multidisciplinary team.

“So we have people from the mechanical engineering background, electrical engineering background, computer science background, software background. We were easily able to delegate work to every person. I think that was important in the success of the product. And we all knew each other, so it was like working with best friends.”

 

GENESIS

 

And how did the idea come about?

Well, says the team (all five of whom hopped on a video call with InDro Robotics), they noticed a problem in need of a solution. Quite frequently on campus – and particularly after events – they’d noticed that the green space was littered. Cans, bottles, wrappers – you name it.

They also noticed that crews would be dispatched to clean everything up. And while that did get the job done, it wasn’t perhaps the most efficient way of tackling the problem. Nor was it glamorous work. It was arguably a dirty and dull job – one of the perfect types of tasks for a robot to take on.

“Large groups of people were coming in and manually picking up this litter,” says Shaji.

“And we realised that automation was the right way to solve that problem. It’s unhygienic, there are sanitation concerns, and physically exhausting. Robots don’t get tired, they don’t get exhausted…we thought this was the best use-case and to move forward with.”

Below: Working on the mechanical side of things


GETTING STARTED

 

You’d think, with engineers, the first step in this project would have been to kick around design concepts. But the team focussed initially on market research. Were there similar products out there already? Would there be a demand for such a device? How frequently were crews dispatched for these cleanups? How long, on average, does it take humans to carry out the task? How many people are generally involved? Those kinds of questions.

After that process, they began discussing the nuts and bolts. One of the big questions here was: How should the device go about collecting garbage? Specifically, how should it get the garbage off the ground?

“Cleaning outdoor spaces can vary, because outdoor spaces can vary,” says team member Aadith Kumar. “You might have sandy terrain, you might have open parks, you might have uneven terrain. And each of these pose their own problems. Having a vacuum system on a beach area isn’t going to work because you’re going to collect a lot of sand. The vision is to have a modular mechanism.”

A modular design means flexibility: Different pickup mechanisms would be swappable for specific environments without requiring an entirely new robot. A vacuum system might work well in one setting, a system with the ability to individually pick items of trash might work better somewhere else.

The team decided their initial prototype should focus on open park space. And once that decision was made, it became clear that a brush mechanism, which would sweep the garbage from the grass into a collection box, would be the best solution for this initial iteration.

“We considered vacuum, we considered picking it up, we considered targeted suction,” says Kumar. “But at the end of the day, for economics, it needed to be efficient, fast, nothing too complicated. And the brush mechanism is tried and tested.”

Below: Work on the brush mechanism

 

 


SAUBERBOT

 

The team decided to call its robot the SauberBOT. “Sauber” is the German word for “clean”. But that sweeping brush mechanism would be just one part of the puzzle. Other areas to be tackled included:

  • Depth perception camera for identifying trash to be picked up
  • LiDAR programmed so that obstacles, including people, could be avoided (a simplified sketch follows this list)
  • Autonomy within a geofenced location – ie, the boundaries of the park to be cleaned
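
The team’s actual avoidance stack was more sophisticated, but the core reflex – stop when the 2D LiDAR sees something too close – can be sketched in a few lines. This assumes a SICK driver publishing sensor_msgs/LaserScan on the conventional /scan topic; the stop distance is an invented value.

```python
import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

STOP_DISTANCE = 0.6   # metres; assumed safety threshold

# Simplified safety reflex (not the team's actual stack): halt the robot
# whenever the 2D LiDAR reports any valid return closer than STOP_DISTANCE.
class SafetyStop(Node):
    def __init__(self):
        super().__init__("safety_stop")
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.create_subscription(LaserScan, "scan", self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        valid = [r for r in msg.ranges
                 if math.isfinite(r) and msg.range_min <= r <= msg.range_max]
        if valid and min(valid) < STOP_DISTANCE:
            self.pub.publish(Twist())   # all-zero Twist = full stop
            self.get_logger().warn(f"Obstacle at {min(valid):.2f} m - stopping")

def main():
    rclpy.init()
    rclpy.spin(SafetyStop())

if __name__ == "__main__":
    main()
```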

There was more, of course, but one of the most important pieces of the puzzle was the robotic platform itself: The means of locomotion. And that’s where InDro Robotics comes in.

 

THE INDRO CONNECTION

 

Some team members had met InDro Account Executive Luke Corbeth earlier in the year, at the IEEE International Conference on Robotics and Automation, held in Philadelphia in 2022. Corbeth had some robotic platforms from AgileX – which InDro distributes in North America – at the show. At the time the conference took place, the SICK competition wasn’t yet underway. But the students remembered Corbeth – and vice versa.

Once the team formed and entered the contest, discussions with InDro began around potential platforms.

The team was initially leaning toward the AgileX Bunker – a really tough platform that operates with treads, much like a tank. At first glance, those treads seemed like the ideal form of locomotion because they can operate on many different surfaces.

But Luke steered them in a different direction, toward the (less-expensive) Scout 2.0.

“He was the one who suggested the Scout 2.0,” says team member Rithwik Udayagiri.

“We actually were thinking of going for the Bunker – but he understood that for our use-case the Scout 2.0 was a better robot. And it was very easy to work with the Scout.”

Corbeth also passed along the metal box that houses the InDro Commander. This enabled the team to save more time (and potential hassle) by housing all of their internal components in an IP-rated enclosure.

“I wanted to help them protect their hardware in an outdoor environment,” he says. “They had a tight budget, and UPenn is a pretty prominent robotics program in the US.”

But buying from InDro raises the question: Why not build their own? A team of five roboticists would surely be able to design and build something like that, right? Well, yes. But they knew they would have plenty of work of their own without building a platform from scratch. Taking that on would have diverted them from their core R&D tasks.

“We knew we could do it in a month or two,” says Udayagiri. “But that would have left us with less time for market research and actually integrating our product, which is the pickup mechanism. We would have been spending too much time on building a platform. So that’s why we went with a standalone platform.”

It took a little longer than planned to get the recently released Scout 2.0 in the hands of the UPenn team. But because of communication with Luke (along with the InDro-supplied use of the Gazebo robot simulation platform), the team was able to quickly integrate the rest of the system with Scout 2.0 soon after it arrived.

“The entire project was ROS-based (Robot Operating System software), and they used our simulation tools, mainly Gazebo, to start working on autonomy,” explains Corbeth. “Even though it took time to get them the unit, they were ready to integrate their tech and get it out in the field very quickly. The one thing that blew me away was how quickly they put it together.”

It wasn’t long before SauberBOT was a reality. The team produced a video for its final submission to SICK. The SauberBOT team took first place, winning $10,000 plus an upcoming trip to Germany, where they’ll visit SICK headquarters.

Oh, and SauberBOT? The team says it cleans three times more quickly than a typical human crew.

Here’s the video.

 

A CO-BOT, NOT A ROBOT

 

Team SauberBOT knows some people are wary of robots. Some believe they will simply replace human positions and put people out of work.

That’s not the view of these engineers. They see SauberBOT – and other machines like it – as a way of helping to relieve people from boring, physically demanding and even dangerous tasks. They also point out that there’s a labour shortage, particularly in this sector.

“The cleaning industry is understaffed,” reads a note sent by the team. “We choose to introduce automation to the repetitive and mundane aspects of the cleaning industry in an attempt to do the tasks that there aren’t enough humans to do.”
 
 
And what about potential jobs losses?
 
 
“We intend to make robots that aren’t aimed to replace humans,” they write.
 
 
“We want to equip the cleaning staff with the tools to handle the mundane part of cleaning outdoor spaces and therefore allow the workforce to target their attention to the more nuanced parts of cleaning which demand human attention.”
 
In other words, think of SauberBOT as a co-operative robot meant to assist but not replace humans. These are sometimes called “co-bots.” 
 
 
Below: Testing out the SauberBOT in the field

INDRO’S TAKE

 

We’re obviously pleased to have played a small role in the success of the UPenn team. And while we often service very large clients – including building products on contract for some global tech giants – there’s a unique satisfaction that comes from this kind of relationship.

“It’s very gratifying,” says Corbeth. “In fact, it’s the essence of what I try to do: Enable others to build really cool robots.”

The SauberBOT is indeed pretty cool. And InDro will be keeping an eye on what these young engineers do next.

“The engineering grads of today are tomorrow’s startup CEOs and CTOs,” says InDro Robotics Founder/CEO Philip Reece.

“We love seeing this kind of entrepreneurial spirit, where great ideas and skills lead to the development of new products and processes. In a way, it’s similar to what InDro does on a larger scale. Well done, Team SauberBOT – there’s plenty of potential here for a product down the road.”

If you’ve got a project that could use a robotic platform – or any other engineering challenge that taps into InDro’s expertise with ground robots, drones and remote teleoperations – feel free to get in touch with Luke Corbeth here.

Area X.O unveils new simulation portal

By Scott Simmie

 

Area X.O, the Ottawa facility founded and operated by Invest Ottawa that houses cutting-edge companies involved in robotics and smart mobility R&D, has unveiled a powerful new tool.

It’s a simulation portal that will allow firms to virtually test products under development. Want to put a robot through its paces on the roads at Area X.O to evaluate its propulsion system and battery life? Have a drone overfly and capture data? Maybe you want to test in snow and cold temperatures, despite it being summertime?

Unless you happen to be an Area X.O tenant, carrying out any of these tasks in real life would involve getting permission, getting your product to the site – even waiting for months and taking multiple trips if you wanted to test under a variety of weather conditions. The costs on this would quickly add up, and your development time would stretch.

With the new simulator, you can put your robot or drone (or sensor) through its paces remotely – whether you’re in Ottawa, Vancouver, or even further afield. And you can use the data gathered in the simulator to improve and refine your real-world product.

“Until recently, Area X.O was limited to the physical world,” said Patrick Kenny, Senior Director of Marketing and Communications for Invest Ottawa, Area X.O and Bayview Yards.

“This past winter, Area X.O launched a simulation discovery portal powered by Ansys. The simulation portal and program promotes simulation and its ability to reduce time, cost, effort and risk by getting breakthrough innovations to market faster. Innovators now have a new option to consider.”

Kenny made his remarks during a June 7 webinar. During that event, Area X.O engineers Barry Stoute and Hossain Samei explained how the system works – and even carried out a real-time demonstration.

 


POWERED BY ANSYS

 

The brains behind the system come from Ansys, which has been in the simulation software business for more than 50 years. It is widely considered to be the most powerful software of its kind.

“Simulation is an artificial representation of a physical model,” explained simulation engineer Dr. Stoute. He went on to explain, at a high level, two different types of simulation: Finite Element Analysis (FEA) and Digital Mission Engineering.

In a nutshell, FEA uses software (and really good computers) to see how different models behave under different conditions. The model can be anything: A robot, an antenna, a drone – you name it.

“Finite Element Analysis solves for mechanical structures, thermal analysis, electronics and optical (components),” explained Dr. Stoute. Want to know what temperature a component might heat to under load? Determine how a transmitter or antenna might behave in differing temperatures? Even “see” what an optical sensor might capture when mounted on a robot? Plug in the right parameters and powerful computing will give the answer.
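Ansys solves full 3D multiphysics problems, but the core idea of FEA fits in a toy example: split a heated 1D rod into elements, assemble a stiffness system K·T = f, and solve for the nodal temperatures. All material values below are illustrative.

```python
import numpy as np

# Toy 1D steady-state heat conduction FEA: -k * T''(x) = q on a rod with
# both ends held at a fixed temperature. Ansys solves vastly more complex
# 3D multiphysics versions of this same assemble-and-solve idea.
L = 1.0        # rod length (m)          - illustrative value
k = 50.0       # conductivity (W/m*K)    - illustrative value
q = 1000.0     # heat source (W/m^3)     - illustrative value
n_el = 10      # number of finite elements
h = L / n_el
n_nodes = n_el + 1

K = np.zeros((n_nodes, n_nodes))   # global stiffness matrix
f = np.zeros(n_nodes)              # global load vector

# Assemble per-element stiffness and load contributions
ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
fe = (q * h / 2.0) * np.ones(2)
for e in range(n_el):
    K[e:e+2, e:e+2] += ke
    f[e:e+2] += fe

# Dirichlet boundary conditions: both ends held at 20 degrees C
for idx in (0, n_nodes - 1):
    K[idx, :] = 0.0
    K[idx, idx] = 1.0
    f[idx] = 20.0

T = np.linalg.solve(K, f)
print(f"Peak temperature: {T.max():.2f} C at x = {T.argmax() * h:.2f} m")
```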

 

DIGITAL MISSION ENGINEERING

 

This type of simulation is a way of designing a complex system, particularly where multiple assets interact with one another in a simulated environment. In the example seen below, Dr. Stoute says a digital mission engineer could create a model where a drone capturing data interacts with multiple objects. These include satellite communications and a ground station, along with multiple vehicles. The drone’s mission is to capture data from the ground, but the engineer is interested in seeing the Big Picture – the ways in which all these different assets will interact.

The mission engineer can select and modify the parameters of every asset in that model. How powerful is the ground station and what range will it provide? What speed is the aircraft flying at, and at what altitude? What type of aircraft is it? What sensors are on the drone and what are their specifications? What is the battery life? What are the specifications of the drone’s motors? The ambient temperature and wind conditions?

The options are dizzying. But the software – along with a well-trained mission engineer – can create a virtual world where the data outcomes closely predict what would happen in a real-world mission.

“If an engineer creates a physical product and it doesn’t work as planned, they have to go back and remodel it,” explained Dr. Stoute. The simulation environment, by contrast, allows the engineer to tweak that product in a virtual environment without the expense of real-world modifications. Once the product is working well in simulation, those learnings can be applied to the actual physical product.

Plus, of course, weather parameters can easily be changed; something impossible in real-world testing (unless you’ve got lots of time on your hands).

“Should he wait until January to get a blizzard to test the product?” asked Dr. Stoute.

“No, it doesn’t make sense. The simulator can simulate blizzard conditions.”

 

Below: Dr. Stoute explains how Digital Mission Engineering works during the webinar

 


REAL-TIME DEMONSTRATION

 

Now that the basics were explained, the webinar moved on to demonstrate these concepts. Area X.O engineer Hossain Samei took over the controls, doing a real-time demo of the sim’s capabilities.

For this, Samei used not only the Ansys core system, but another powerful piece of software called Ansys AVxcelerate, which is used to test and validate sensors for self-driving cars. That means you can plug virtual sensors – including all of their technical parameters – into the system. And not simply the sensors on the cars. In this simulation, which features a very high-resolution 3D map of the Area X.O complex, Samei also had the sensors on the Area X.O site embedded into this virtual world.

“This digital twin also includes the infrastructure embedded into our smart city zone,” explained Samei. “This includes multiple sensors, optical cameras, roadside units, thermal cameras and LiDAR cameras.” The model even includes functioning railroad crossing gates.

“We’re able to simulate the arms moving up and down,” he said.

And remember how the Ansys system can simulate weather? The mission engineer can also tailor lighting conditions – very useful for testing visual sensors.

 

VIRTUAL TEST DRIVE

 

Samei already had the digital twin of Area X.O defined. He then quickly put together an autonomous vehicle and camera sensor using AVxcelerate.

“Once we have our car defined, as well as the sensors on the vehicle, we’re able to move on to choosing a car simulator,” said Samei.

In order to help the car drive on Area X.O’s terrain, Samei turned to the open-source Webots robot simulator.

“With Webots, you can define your vehicle, including its suspension, power train and other features, to define the vehicle dynamics of the car,” said Samei.

And now? It was time for a drive.

Samei began to pilot the car around Area X.O – showing as well that he could change the setting from a clear and dry day to one with snow on the ground with just a few clicks. As the car drove down the road, you could see some of the Smart City sensors that are physically (and virtually) embedded in the Area X.O environment.

“You can see as we pull up, all of the sensors in the environment are visible. That kind of demonstrates what we’re able to do with this model,” he said.

 

VIRTUAL DRONE FLIGHT

 

Samei then moved on to programming an autonomous drone flight over one of the experimental farm fields that surround the Area X.O facility. For this portion of the demo, he utilized the Ansys STK toolkit – specifically designed for Digital Mission Engineering. You’ll recall Dr. Stoute spoke of this, and its ability to simulate entire systems – including ground stations, satellite communication, etc.

Samei defined the area of the field to be scanned, then “built” the quadcopter by selecting motors, battery, propellers – even the pitch of the blades.

“We end up with a very accurate model of a drone that reflects its actual performance,” he said.

He also programmed the altitude of the drone and the density of the scan – with passes over the field 400′ apart. With that and a few more clicks (all in real-time, which was pretty impressive to watch), he sent the drone off on its mission.

The virtual drone quickly scanned the desired area and returned to base with power to spare. Samei then plotted a more tightly focussed grid – lower altitude and more overlap, with grid passes 200′ apart – for greater data density. Then he sent the quadcopter off again.

In this example, Samei was interested in whether the quadcopter could cover the scan with its existing power supply. He was also keen to learn if the ground station would be able to communicate with the drone throughout its mission. Both of these questions were answered in the affirmative without having to use a physical drone.

“We were able to verify the flight does not need more energy than the battery can provide,” he observed. “We can (also) see the minimum signal strength required – so indeed we are able to maintain consistent communication throughout the mission.”
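Ansys STK answers those questions with detailed vehicle, battery and RF models. The back-of-the-envelope version of the energy check looks something like the sketch below – with all values invented for illustration, not taken from the webinar.

```python
# Back-of-envelope version of the questions Samei put to the simulator:
# can the drone cover a lawnmower survey pattern on one battery?
# All numbers below are illustrative assumptions, not Area X.O's values.

def survey_energy(width_m, height_m, spacing_m, speed_ms, power_w, battery_wh):
    """Estimate flight time and energy for a lawnmower scan of a field."""
    n_passes = int(width_m / spacing_m) + 1                     # parallel legs
    path_m = n_passes * height_m + (n_passes - 1) * spacing_m   # legs + turns
    flight_s = path_m / speed_ms
    energy_wh = power_w * flight_s / 3600.0
    return flight_s, energy_wh, energy_wh <= battery_wh

# 400-ft (~122 m) spacing vs the tighter 200-ft (~61 m) grid from the demo
for spacing in (122.0, 61.0):
    t, e, ok = survey_energy(500, 500, spacing, speed_ms=10.0,
                             power_w=400.0, battery_wh=100.0)
    print(f"spacing {spacing:5.1f} m: {t/60:4.1f} min, {e:5.1f} Wh, "
          f"{'fits' if ok else 'exceeds'} battery")
```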

That was impressive enough. But get this: The simulation software can even account for potential signal interference caused by buildings. And such flights – whether it’s a drone or a Cessna or a business jet – are not limited to Area X.O. Ansys STK has a database of pretty much anywhere on the planet.

“You can simulate your missions and flights over anywhere on earth,” said Samei.

 

Below: A screen capture during Hossain Samei’s real-time demo. Here, he’s configuring the technical parameters for a simulated quadcopter’s propulsion system


WAIT, THERE’S MORE

 

The real-time demo was impressive. But it left one wondering: What kind of computer do you need to make these kinds of simulations actually work? Surely the computational power required exceeds what most of us carry around in our laptops.

And that’s true. But the good news is, the Area X.O simulator portal includes access to the precise kind of computer required.

“What we’re providing with our simulation services is access to our computers,” said Samei.

“We have the workstations necessary that have the computational power, the memory, that’s able to simulate these problems very fast. So it’s not necessary for the clients to have a supercomputer in order to run the simulations. We can take that 10-day simulation time down to 10 hours.”

 

THE VIRTUAL ADVANTAGE

 

If it wasn’t clear by now (and it surely was), the webinar wrapped with a reminder of why simulation is such a powerful and cost-effective tool for developers.

“We can do more different physics-based simulations such that you don’t have to build…expensive prototypes,” said Dr. Stoute. “People can actually imagine the wildest designs without any limitations. Having your wildest dreams imaginable.”

Engineer Hossain Samei also weighed in.

“One thing I really do believe in is: Knowledge is power,” he said.

“What simulation…lets us know (is) what’s going to happen and not suffer the consequences from actually having to make a product…and then find out: ‘Oops, I have a problem’. Simulation allows you to circumvent that and identify these issues before, where it’s easier to actually solve them.”

 

WANT TO TRY IT?

 

You can! Though the Area X.O simulation portal is ultimately a paid service, those interested in learning more can sign up for further free demos to get a better sense of what this resource is capable of delivering.

Sign up for free on this page.

If you thought you missed a cool demo, you did. But no worries, you can watch a replay of the entire webinar below:

INDRO’S TAKE

 

The Ansys platform is acknowledged as the best simulation platform going. And with the expertise of Area X.O engineers Dr. Barry Stoute and Hossain Samei, we’re confident a solution can be tailored for pretty much any product operating in any environment.

“It’s a normal part of R&D to go through various iterations of products following real-world testing,” says InDro Robotics CEO Philip Reece. “And while products ultimately need to be tested in the real world prior to deployment, high-level simulation can save time, money – and mistakes.

“Even though our R&D hub is situated right at Area X.O, we plan on tapping into this powerful tool to analyze some of our products currently on the drawing board.”

If you’re interested in learning more about this new tool, drop Area X.O a line here.