QUEBEC’S HAPLY ROBOTICS MAKES THE VIRTUAL FEEL REAL

By Scott Simmie

 

Odds are you’ve heard of remote surgery by now.

That’s where a surgeon, looking at screens that provide incredibly detailed 3D video in real time, conducts the operation using a controller for each hand. The inputs on those controllers are translated into scaled-down movements of robotic arms fitted with the appropriate medical devices. The robotic arms move a precise fraction of the distance travelled by the operator’s hands. As a result, these systems allow for far greater control, particularly during very fine or delicate procedures. 
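At its core, that scaling is simple arithmetic: measure how far the operator’s hand has moved, multiply by a fraction, and command the arm by that smaller amount. Below is a minimal Python sketch of the idea; the positions, names and the scale factor are our own illustrations, not the code of any real surgical system.

    # Minimal sketch of motion scaling for a teleoperated arm.
    # The 0.2 scale factor and all names are illustrative assumptions,
    # not taken from any real surgical system.

    SCALE = 0.2  # the arm moves 1 mm for every 5 mm of hand movement

    def scaled_arm_target(hand_prev, hand_now, arm_now):
        """Translate a hand displacement into a scaled-down arm displacement."""
        dx = hand_now[0] - hand_prev[0]
        dy = hand_now[1] - hand_prev[1]
        dz = hand_now[2] - hand_prev[2]
        return (arm_now[0] + SCALE * dx,
                arm_now[1] + SCALE * dy,
                arm_now[2] + SCALE * dz)

    # Example: a 10 mm hand movement along x becomes a 2 mm arm movement.
    print(scaled_arm_target((0, 0, 0), (10, 0, 0), (0, 0, 0)))  # (2.0, 0.0, 0.0)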

The surgeon might be at a console in the operating theatre where the patient is. Or they could be operating on someone remotely. You could have a specialist in Montreal perform an operation on someone elsewhere in the world – provided you’ve got a speedy data connection.

The video below does a really good job of explaining how one of the best-known systems works. 

 

THE IMPORTANCE OF FEEL

 

Conducting standard surgery (or a variety of other tasks) without robots involves constant tactile feedback.  If a doctor is moving an instrument through tissue – or even probing inside an ear – they can feel what’s going on. Think of cutting a piece of fruit; you adjust the pressure on the knife depending on how easy the fruit is to slice. When you put a spoon into a bowl of jello, that constant feedback from the utensil helps inform how hard or soft you need to push.

This tactile feedback is very much a part of our everyday lives – whether it’s brushing your teeth or realising there’s a knot in your hair while combing it. Even when you scratch an itch, you’re making use of this feedback to determine the appropriate pressure and movements (though you have the additional data reaching your brain from the spot being scratched).

But how do you train someone to perform delicate operations like surgery – even bomb defusal – via robotics? How do you give them an accurate, tactile feel for what’s happening at the business end? How much pressure is required to snip a wire, or to stitch up a surgical opening?

That’s where a company from Quebec called Haply Robotics comes in.

“Haply Robotics builds force-feedback haptic controllers that are used to add the sense of touch to VR experiences, and to robotic control,” explains Product Manager Jessica Henry. “That means that our controller sits on the human interface side and lets the human actually use their hand to do a task that is conveyed to a robot that’s performing that task.”

We met some of the Haply Robotics team during the fall at the IROS 2023 conference in Detroit. We had an opportunity for a hands-on experience, and were impressed.

 

INVERSE3

 

That’s the name of Haply’s core product.

“The Inverse3 is the only haptic interface on the market that has been specially designed to be compact, lightweight, and completely portable,” says the company’s website. “Wireless tool tracking enables you to move freely through virtual environments, while our quick tool change mechanism allows you to easily connect and swap VR controllers, replica instruments, and other tools to leverage the Inverse3’s unmatched power and precision for next-generation force-feedback control.

“The Inverse3 replicates tactile sensory input required for simulating technical tasks. It can precisely emulate complex sensations like cutting into tissue or drilling into bone – empowering students, surgeons, and other healthcare professionals to hone and perfect medical interventions before ever performing them in the clinical environment.”

Haply Robotics has produced an excellent video that gives you both a look at the product – and how it works:

 

WHAT DOES IT FEEL LIKE?

 

While at IROS, we had a chance to put our hands on the Inverse3.

In one of the simulations (which you’ll see shortly), the objective was to push a small sphere through a virtual gelatin-like substance. As you start pushing the ball against that barrier, you begin to feel resistance through the handle of the Inverse3. Using force feedback, you continue to push and feel that resistance increase. Finally, when you apply just enough pressure, the ball passes through the gelatin. The sensation, which included a satisfying, almost liquid ‘pop’ as the ball passed through, was amazing. It felt exactly the way you would anticipate a real-world object feeling.
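Under the hood, force-feedback simulations like this one typically compute a resisting force from how far the virtual tool has penetrated a surface, then drop that force abruptly once a breakthrough threshold is crossed – which is what produces the ‘pop’. The Python sketch below is a generic, hypothetical illustration of that idea; the stiffness value, threshold and function names are our own assumptions, not Haply’s or its partners’ code.

    # Hypothetical spring-model sketch of "pushing a ball through gelatin".
    # The stiffness, threshold and names are illustrative assumptions only.

    STIFFNESS = 300.0         # newtons per metre of penetration (spring model)
    POP_THROUGH_DEPTH = 0.01  # metres of penetration before the surface yields

    def resistance_force(penetration_depth, popped):
        """Return (force, popped) for one update of the haptic loop."""
        if popped or penetration_depth <= 0.0:
            return 0.0, popped                       # free motion, no resistance
        if penetration_depth >= POP_THROUGH_DEPTH:
            return 0.0, True                         # breakthrough: the 'pop'
        return STIFFNESS * penetration_depth, False  # resistance grows with depth

    # Example: the force ramps up, then vanishes once the ball passes through.
    popped = False
    for depth in (0.002, 0.005, 0.008, 0.011):
        force, popped = resistance_force(depth, popped)
        print(f"depth={depth:.3f} m  force={force:.1f} N")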

“Touch adds more information, as opposed to just having the visual information,” explains Henry. “You also have the tactile information, so you have a rich amount of information for your brain to make a decision. You can even introduce different haptic boundaries so you can use things like AI in order to add some kind of safety measure. If the AI can say ‘don’t go there’ – it can force your hand out of the boundary with haptic cues. So it’s not just visual, it’s not just audio.”

 

SIMULATION, TRAINING…AND MORE

 

The Inverse3 is already in use for simulation training in the medical industry. In fact, many existing devices for robotic surgery do not have haptics – and there’s clearly a demand.

“Robotic surgical consoles don’t use haptics yet, and we’re hearing that surgeons are asking for that to be added because it’s missing that sense,” says Henry. “A mistake they can make is to push an instrument too far in because it’s just visual. If you had haptics on your handles, you would intuitively know to pull back.”

Remember how we tried pushing a virtual object through a gel-like substance? You’ll see that in this video around the :24 mark:

THE HAPLY STORY

 

Well, it’s not the entire Haply Robotics story, but here it is in a nutshell.

The idea for the product – or rather, the recognition of the need for such a product – first surfaced in 2016. The three co-founders were working on haptic devices at Canada’s National Research Council. Existing devices at the time were large and tended not to offer the greatest user experience. They saw an opportunity to create something better. The company has been in business since 2018 – with these three at the helm:

  • Colin Gallacher (MEng, MSc, President)
  • Steve Ding (MEng, Electrical lead)
  • Felix Desourdy (BEng, Mechanical lead)

The trio put their heads together and – a lot of R&D later – produced the Inverse3.

The company manufactures the physical product, which contains three motors to provide haptic feedback. Haply Robotics also makes an API, but the coding for the simulations comes from outside partners. Fundamental VR, for example, is a company devoted to developing virtual training simulations for everything from ophthalmology to endovascular procedures. It coded that gelatin simulation.

“Studies confirm that VR significantly improves the effectiveness of medical education programs. Adding real haptics increases accuracy and delivers full skills transfer,” says the Fundamental VR website. In fact, it cites research showing a 44 per cent improvement in surgical accuracy when haptics are part of the VR experience.

“In the training space, when you’re using it for simulation, a surgeon’s work is very tactile and dexterous,” says Haply’s Jessica Henry. “We enable them to train using those instruments with the proper weights, the proper forces, that they’d encounter in surgery as opposed to textbooks or cadavers. It’s a more enriched way of interacting.”

And it really, really feels real.

Below: Haply’s Jessica Henry manipulates the Inverse3

 

 


INDRO’S TAKE

 

It’s always great discovering a new company in the robotics field, particularly one with an innovative solution like the Inverse3. It’s also great when these companies are Canadian.

“Haply Robotics has identified a clear void in the marketplace and created a solution,” says InDro Robotics CEO Philip Reece. “With the growth in remote robotics – not just surgery – I can see a wide range of use-cases for the Inverse3. Congratulations to the Haply team on being ahead of the curve.”

For more info on the product, check out the Haply Robotics website.

Assistive devices on the rise at Korea’s Robot World Conference

By Scott Simmie

A major robotics conference is underway in Seoul, South Korea.

Robot World 2023 features some 200 exhibitors and 700 booths, ranging all the way from heavy hitters like Hyundai (which makes robots for industrial purposes) through to companies that manufacture the various widgets that make up the robot supply chain. There are manufacturers of wheels, servos, end effectors, lubricants, cable management systems – you name it, you’ll find it.

Need a hand? There’s no shortage of robotic arms. While many are suited for factory and warehouse work, others are destined for the food services industry. Turn a corner and you’re more likely than not to see an arm smoothly pouring a coffee, grabbing a soft drink or snack and presenting it to an attendee.

Below: A Hyundai robot that can lift and reposition automobiles. It’s part of the Hyundai WIA (World Industrial Ace) division.

USE-CASES

 

The robots at this show illustrate the many use-cases. There are welding robots, pick-and-place machines, and heavy-lift AMRs (Autonomous Mobile Robots) that can lift more than a ton. Need something stacked, sorted, inspected, delivered? Want a manipulator arm you can program to start preparing French fries the moment a take-out order has been placed by an app? Need a robot to move a car?

At Robot World 2023, you’ll find all of the above – and more.

 

ASSISTIVE DEVICES

 

But there was another category of robot on display at the exhibition: Assistive medical devices. Specifically, very smart machines that can be used for patients requiring rehabilitation.

InDro Robotics, which was invited to attend Robot World 2023, was struck by the number of companies with products in this sector. There were ground robots – friendly-looking devices that keep an eye on vulnerable people and can call for assistance if there’s a fall or some other crisis. But more intriguing to us were machines that can play a role – both physiologically and psychologically – in helping to rehabilitate someone from a serious injury or other challenging condition.

Below: A shape-shifting wheelchair wheel can climb stairs


RE-LEARNING TO WALK

 

As at any major convention, exhibitors range from established global companies like Hyundai all the way to smaller startups with a great idea. One that caught our attention is a company called Astrek Innovations. Its CEO and co-founder is Robin Kanattu, a young engineer from Kerala in southern India.

“We are mainly focussing on building and designing products for the 20 per cent of people who are suffering from disability and accessibility issues,” says Robin. “One of the products is the lower limb exoskeleton, for people who are suffering from lower limb disabilities.”

As the company’s website explains:

“Established in 2018, we develop cutting edge solutions to some of our most complex problems – Disability and Rehabilitation. Leveraging our knowledge and expertise in robotics, machine learning and motion capture, we design devices that would transform the current state-of-the-art in the rehabilitation and assistive technology arena.

“Our magnum opus is a wearable robotic device, an exoskeleton, that would help people with lower-limb immobility walk again. A culmination of motorised limb braces, motion capturing & tracking; and machine learning; this device would transform rehabilitation into a precise, immediate treatment protocol.”

The company has been building and testing versions of this product for four years.

“Now we have a final version, and we wanted to provide independence for people who are suffering from these disabilities,” he says.

A lot of research has gone into this product. Robin says a great deal of groundwork was spent capturing data on healthy people: How they walk, how they sit, how gaits alter during the course of a stride.

“Now we use that same data to predict the walking pattern of users, so they will have much more stable walking and standing while using the device.”

The exoskeleton provides support and strength and moves the legs. Forward-facing crutches are used to aid in stability. The product can be used by people who are paralysed from the waist down, people recovering from strokes, those with certain genetic conditions, and people recovering from accidents.

 

HOW THE IDEA WAS BORN

 

Robin is an electrical engineer. But there was a personal motivation to put his skills to use in this arena.

“My grandfather had this issue. After having an accident, he was not able to walk properly. And after doing knee replacement surgery he was not able to walk again,” he explains. “So that’s how our team came together.”

Astrek has been recognized for product excellence at Robot World 2023, and Korea has brought the company in on a program called the K Startup Grand Challenge. Robin has been working in Korea on streamlining the manufacturing chain, working with mentors and looking for collaboration.

But the product, he says, is fully functioning. And people who are paralyzed from the waist down have been able to walk with it.

“Psychologically, they are so happy,” he says. “Their sole dream is to walk again, and we are happy to see them doing that.”

Robin did not have the prototype at the show because of red tape involving flying the batteries to Seoul. He’s pictured below with a banner showing the device.


ROBOT REHABILITATION

 

Another company, RpiO, has already cracked the market. Its R-BoT plus is a device designed for people with central nervous system damage (including stroke, paraplegia, spinal cord injury, etc.). It’s more of a rehab device designed for hospital settings, but allows users to exercise lower extremities while lying down or standing upright. The product is approved by a Korean regulatory body (Korean Ministry of Food and Drug Safety, formerly the KFDA), and the company has already sold seven units inside Korea.

“We have major hospitals, local hospitals and private hospitals who are using the machine with people who have damage impacting their lower body,” explains CEO Jay Moh.

“Because KFDA is a standard in Southeast Asia, we are starting to sell in Hong Kong, Malaysia and Singapore. Many doctors have come to see our robots.”

The R-BoT plus works in three modes: Passive, active and resistive – depending on the patient’s abilities. What sets this device apart is that the person exercising watches a large-screen display during rehabilitation sessions. The display features outdoor scenes, and with every ‘step’ made, a footprint appears on the ground and the patient has a visual cue that they’re making progress. Distance covered, calories burned and heart rate are all displayed as well, providing further incentive.

“Once the machine starts, they look at the display,” he says. “This has been medically tested; this stimulates the brain and releases a chemical that stimulates recovery. People feel better – they enjoy the workout and feel like they’re walking through the grass.”

For those ready to actually move in the real world, the company also has a product called EXOwalk. Here, an exoskeleton is strapped to the patient’s limbs and can help move their legs (again, in multiple modes). But this exoskeleton is fixed to a rolling robotic platform – meaning the patient actually moves forward on the ground, rather than being fixed to a static machine.

“This is driven – so they actually move along the hallway in the facility.”

 

EXO Motion

 

For patients with upper limb motor impairments, the company has developed a product called EXO motion. This is a portable exoskeleton device that attaches to the arm. In active mode, it detects myoelectric signals from the user’s arm and – with some sophisticated algorithms and mechatronics – converts those signals into mechanical motion that moves the arm.
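In broad strokes, active-mode control of this kind means measuring the muscle’s electrical activity, smoothing it into an ‘effort’ estimate, and commanding the motor in proportion once that effort crosses a threshold. The snippet below is a simplified, hypothetical Python sketch of that pipeline; none of the values or names come from RpiO’s actual implementation.

    # Hypothetical sketch: turning a myoelectric (EMG) signal into a motor command.
    # Thresholds, gains and names are illustrative assumptions, not RpiO's code.

    THRESHOLD = 0.05   # minimum muscle "effort" before the arm moves
    GAIN = 2.0         # motor speed per unit of effort above the threshold

    def emg_envelope(samples):
        """Rectify and average raw EMG samples into a single effort value."""
        return sum(abs(s) for s in samples) / len(samples)

    def motor_command(samples):
        """Return a motor velocity proportional to effort above the threshold."""
        effort = emg_envelope(samples)
        if effort < THRESHOLD:
            return 0.0
        return GAIN * (effort - THRESHOLD)

    # Example: a weak signal does nothing; a stronger one drives the motor.
    print(motor_command([0.01, -0.02, 0.015]))   # 0.0
    print(motor_command([0.20, -0.18, 0.22]))    # 0.3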

In addition to these robotic devices, RpiO also is a leading company in software designed to help people with dementia.

“We have a high population of elderly people who suffer with this,” says Moh. “So the market is growing very fast.”

Below: CEO Jay Moh, followed by the R-BoT plus and display. Note the footprints…

 


INDRO’S TAKE

 

We enjoyed checking out these devices at Robot World 2023 – and were pleased to see yet more evidence of #robotsforgood.

“Robots can be tremendous tools on their own,” says InDro Robotics CEO Philip Reece. “But there’s something truly special about products designed to directly help human beings improve their mobility and health. We applaud the inventors and engineers who develop these products, and look forward to even more assistive device breakthroughs in future.”

And a final note: The feature image at the top of this story shows some very, very tiny arms used for microsurgery. InDro was able to take a run at the controls (pictured below). It took some patience, but we were able to grasp an impossibly small elastic band.

Now picture a highly skilled microsurgeon operating on someone remotely.

It’s happening now, thanks to robotics.

Rockwell Automation to purchase Clearpath Robotics

By Scott Simmie

 

There’s some big news in the Canadian robotics world.

US-based Rockwell Automation, which describes itself as “the world’s largest company dedicated to industrial automation and digital transformation,” has announced it has signed an agreement to purchase Canadian company Clearpath Robotics.

Clearpath is known for its Autonomous Mobile Robots (AMRs), many of which are designed to move heavy loads inside warehouses. In fact, Clearpath has an entire division – OTTO Motors – which specialises in AMRs, along with software for fleet management and navigation.

These are the kinds of vehicles we’re talking about – which can clearly aid in efficiency:

THE NEWS

 

Word of the planned acquisition came in a September 5 news release from Rockwell Automation.

“Rockwell Automation, Inc. (NYSE: ROK)…today announced it has signed a definitive agreement to acquire Ontario, Canada-based Clearpath Robotics Inc., a leader in autonomous robotics for industrial applications. Autonomous mobile robots (AMRs) are the next frontier in industrial automation and transformation, and this acquisition will supercharge Rockwell’s lead in bringing the Connected Enterprise to life.”

If you’ve been following the robotics world in the past few years, you’ll be aware that the use of robotics has gone far beyond industrial arms welding car frames or lifting parts into place. Robots have increasingly been deployed to warehouses and other industrial settings to increase efficiency and reduce repetitive and arduous manual labour for human beings. Moving, packing and tracking have become huge – and an increasingly integral part of the supply chain and inventory management. Rockwell Automation clearly sees OTTO Motors as part of its solution going forward:

“Transporting parts and materials to assembly lines and between manufacturing cells is one of the industry’s most complex and inefficient tasks, often resulting in production bottlenecks,” states the release.

“Autonomous production logistics will transform the workflow throughout a manufacturing plant, enabling substantial reductions in cost and greater operational efficiency…Combined with Rockwell’s strong continuing partnerships in fixed robotic arms, solutions such as Independent Cart Technology, and traditional leadership in programmable logic controllers (PLCs), the addition of OTTO Motors’ AMR capabilities will create a complete portfolio of advanced material handling solutions unmatched in the industry.”

The release seems to make it clear that Rockwell Automation sees OTTO Motors as the jewel in the crown. Here’s another look at some of the OTTO Motors AMR solutions:

A GROWING MARKET

 

The news release cites research from Interact Analysis, which points to strong growth in this field in the coming years. Demand for AMRs in manufacturing, says the release, is slated to grow at 30 per cent annually over the next five years, “with an estimated market size of $6.2 billion by 2027.”

“Rockwell and Clearpath together will simplify the difficult and labor-intensive task of moving materials and product through an orchestrated and safe system to optimize operations throughout the entire manufacturing facility,” said Blake Moret, Chairman and CEO, Rockwell Automation.

“The combination of autonomous robots and PLC-based line control has long been a dream of plant managers in industries as diverse as automotive and consumer packaged goods. With Clearpath, Rockwell is uniquely positioned to make that dream a reality across virtually all discrete and hybrid verticals, optimizing planning, operations, and the workforce.”

Clearpath is said to have about 300 employees, with the majority working within the OTTO Motors division. And, not surprisingly, the company is pleased with the news.

“Industrial customers are under ever-increasing pressure to do more with less. Autonomous production logistics is becoming a necessity to meet targets and stay competitive,” says Matt Rendall, co-founder and CEO of Clearpath.

“We are excited to join Rockwell and help expand their leadership position in advanced material handling. Together, we will create safer and more productive workplaces with autonomous technology.”

InDro Robotics Vice-President Peter King, who previously worked at Clearpath, has this to say about the acquisition.

“It’s a great opportunity for Rockwell to take on an industry leader in this space at a time when AMRs are about to become the norm,” says King. “Rockwell’s size and market penetration should bode well for global growth.”

The news release goes on to explain how the two companies are a natural fit:

“Data from Rockwell’s offerings and OTTO Motors’ AMRs will be harnessed in artificial intelligence-powered Software as a Service information management applications, such as those by Rockwell’s Plex and Fiix businesses,” it states.

“With this, Rockwell will deliver a unified solution for manufacturing, enabling autonomous execution and optimization to increase efficiency and allow for traceability and real-time adjustments.”

INDRO’S TAKE

 

There aren’t a whole lot of Canadian companies manufacturing robots – let alone with multiple offerings aimed at the warehouse/industrial sector. Clearpath was an early leader on the Canadian robotics scene, and its OTTO Motors division produces some impressive offerings.

“Clearpath got into ground robotics early – and over time really carved out a niche for itself, particularly with OTTO Motors,” says InDro Robotics CEO Philip Reece.

“Robotics is a highly competitive space these days. Rockwell Automation clearly sees some synergy here with its own products and clients. We congratulate Clearpath on this acquisition, and look forward to what we assume will be continued success in the AMR market.”

Clearpath Robotics was founded in 2009 and launched its OTTO Motors division in 2015. Rockwell Automation is headquartered in Milwaukee, Wisconsin, and employs 28,000 people serving clients in more than 100 countries.

A Q&A with Real Life Robotics CEO Cameron Waite

By Scott Simmie

 

There’s a new robot – and a new robotics company – in town.

Real Life Robotics, founded by CEO Cameron Waite, is a cargo and last-mile robotics delivery firm. Its deployments utilise products developed by InDro Robotics intended for both autonomous and tele-operated missions over 4G and 5G networks. These robots, customisable for client-specific applications, are designed for long-range and large payloads.

Its first workhorse, currently doing demos for potential clients, is a unit Real Life calls BUBS. Developed by InDro, the robot is a second-generation delivery machine with impressive payload capability and a lockable payload door that opens upon reaching its destination. It’s suitable for a wide variety of use-cases and can be modified for client-specific needs.

BUBS is packed with features, including:

  • A total of six cameras, including two sets of depth perception cameras at the front and rear for greater situational awareness for the operator
  • LED running lights, signal lights, brake lights
  • Large cargo bay (50kg capacity) that can be opened and closed remotely
  • Greater all-weather protection and a touchscreen interface for customers

Just as the Artist Formerly Known as Prince had a name change, so too has BUBS. InDro’s and Real Life Robotics’ earlier development name for the machine was ROLL-E 2.0. It went through successful trials with London Drugs in Surrey, BC, for home deliveries.

 

Below: BUBS in action during trials in BC for London Drugs

THE INDRO-REAL LIFE CONNECTION

 

Like most business relationships, this one began with a conversation. Specifically, a chat between InDro CEO Philip Reece and Cameron Waite in 2021. Just as InDro has deep expertise in robotics R&D, Waite has a high-level background in robotics sales and customer success. With more than two decades of experience in hardware sales, Waite was an early hire at Canadian success story Aeryon Labs, a pioneering UAV firm that was acquired by FLIR for $200M.

Waite was responsible for global sales with Aeryon, engaging with clients ranging from police and First Responders to defence, commercial inspection, “and everything in-between,” he says. He learned a lot about different use-case scenarios for the product – as well as the ability of engineers to customise capabilities of the core product line to meet the needs of clients.

From there, he went on to a similar position at Avidbots. He was, again, a very early-stage hire with significant responsibilities for helping grow the company.

“There was no revenue there prior to my joining,” he says. “I was the first sales hire at Aeryon; I was the first sales hire at Avidbots.”

So, what did he do at Avidbots?

“I was able to engage with some of the largest companies in the world – everyone from DHL to Walmart. I spent five years there, growing that organization, hiring and managing staff, sales people, but also being involved in customer success and support and product and product development. Making sure that the feedback from the clients in the field was making its way to the team to build and develop and modify and tweak that robot to better fit the needs of our clients so that our clients would continue to want to scale.”

We interviewed Waite (obviously) about his plans for Real Life Robotics, and the kind of clients that might be a solid fit for BUBS.

Below: Real Life Robotics CEO/Founder Cameron Waite in conversation with Scott Simmie


QUESTIONS. AND ANSWERS

 

So let’s get into the Q & As.

Q: Tell us about BUBS

A: BUBS is a second-generation delivery robot. It is a large unit relative to what we typically would see out in the market. It uses a suite of sensors onboard to give it spatial and environmental awareness. We use systems like the NVIDIA Nano to process data that the robot sees in real time, which allows it to make its own autonomous decisions to navigate through the world.

The robot is large-capacity and has a locking, remotely operated lid that allows it to securely hold and transport whatever it is our customers are interested in moving. It’s a large-enough system that it can hold food, beverages, product from a store, pharmaceuticals or lab samples from a hospital, dirty linens in a long-term care facility, product on a golf course – really anything that needs to be transported from Point A to Point B.

It has indoor and outdoor capabilities, so it is weather-resistant. It can handle some pretty significant slopes and terrain. The robot, in addition to its autonomy and sensor package, utilises a radio system that allows us to have remote assistance or piloting as necessary through a WiFi connection, a 4G/5G connection, or private network. We currently use the Rogers 5G network as our backbone, and from a teleoperation or semi-autonomous perspective, we can have an operator located anywhere on planet Earth and have that person remotely assist or operate the robot as necessary with under a tenth of a second of latency. So it’s near real-time using edge computing.
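One practical consequence of teleoperating over cellular networks is that the robot has to protect itself when commands stop arriving or arrive late. A common pattern is a simple watchdog: act on the newest command, and stop if nothing fresh has arrived within the latency budget. The Python sketch below is a generic illustration of that pattern – our own assumption, not Real Life Robotics’ or InDro’s actual code.

    # Generic teleoperation watchdog sketch: stop the robot if commands go stale.
    # The 0.1 s budget and all names are illustrative assumptions.
    import time

    LATENCY_BUDGET = 0.1  # seconds; roughly the "under a tenth of a second" target

    class TeleopWatchdog:
        def __init__(self):
            self.last_cmd = (0.0, 0.0)   # (linear m/s, angular rad/s)
            self.last_cmd_time = 0.0

        def on_command(self, linear, angular):
            """Called whenever a command arrives from the remote operator."""
            self.last_cmd = (linear, angular)
            self.last_cmd_time = time.monotonic()

        def safe_command(self):
            """Return the newest command, or a full stop if it has gone stale."""
            if time.monotonic() - self.last_cmd_time > LATENCY_BUDGET:
                return (0.0, 0.0)        # stale link: stop the robot
            return self.last_cmd

    # Example: a fresh command passes through; after a stall the robot stops.
    wd = TeleopWatchdog()
    wd.on_command(0.5, 0.0)
    print(wd.safe_command())   # (0.5, 0.0)
    time.sleep(0.15)
    print(wd.safe_command())   # (0.0, 0.0)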

 

IS BUBS DESIGNED FOR AUTONOMOUS OPS OR TELE-OPERATION?

 

That was the question. Here’s the answer.

A: The answer depends on the application itself. So some environments where we have a high degree of predictability and can pre-map and understand that environment, those are environments that are more conducive to fully autonomous operations. The robot can be trained using Computer Vision and AI to autonomously navigate through an entire space if that space is predictable.

Alternatively, if there is a high degree of variability, or there are safety or regulatory concerns that require a human in the loop, we have that option as well. So, for example, if a robot was to be traveling in a city environment and it needed to cross a road – that’s a complex procedure for any robot to do. And there’s likely a degree of human interface that would be beneficial to have that robot determine when and where it’s safe to cross the road. Or if a path was blocked by a large-enough obstacle, and the robot needed to exit a geofence that is pre-programmed into that operation, in order to safely manoeuvre around that obstacle, it’s likely a good idea to have a human in the loop to make that complex decision.

The more repetitive times that type of an application happens, the more a robot can be trained to autonomously execute those types of scenarios. As that robot’s deployment increases over time, the human interface required decreases. But there will always be some level of human in the loop.

 

WHAT ABOUT REGULATORY ENVIRONMENTS AND ADOPTION?

 

A: Over the next 10 years, we will see an enormous increase in the reliance on robotics to do basic things like delivery inside municipal environments. One of the things I learned at Aeryon years ago was the importance of engaging with government early on, because government can otherwise potentially shut down your operations at a really inopportune time. And so Real Life Robotics has already engaged with a number of Canadian cities and had early approvals to allow our robots to drive around in certain automation projects in city environments.

Municipalities typically have concerns around full automation and Level 4, Level 5 autonomy. If Elon Musk and his team are not able to get approvals to drive around in downtown Toronto, how do we think we’re going to get the same approvals to drive around autonomously? We’re not. So the cities have actually really embraced the fact that our robots can have a human in the loop to make some of those difficult decisions. That helps alleviate some of the concerns around full autonomy. But we have spent the time building the groundwork to allow us to operate in their environments and we, in return, intend to work very closely with those cities to actually build the playbook, and build the ruleset and the framework around successful and safe deployment of robots in urban environments.

 

CAN BUBS BE MODIFIED FOR CLIENTS?

 

A: Absolutely. Our mandate is to commercialise robots. And as part of any startup growth plan, sometimes there are pivots along the way that you need to make. But in general, a client that has a real ROI potential where robots can facilitate that, and a client that has the potential for scale, that’s our expertise. With the combined benefit of having InDro, we can not only develop a very specific robot solution to solve a customer’s immediate concerns or challenges, we can also scale that robot.

Q: Why did you feel InDro was the best fit for a partner?

A: In general, InDro would be considered a world-class R&D company – hands down, bar none. And that’s why we partnered with them. The firm has an enormous skillset, including expertise with autonomy, sensor fusion and integration. Because the company has all of that, plus a large engineering staff, we’re lucky to call InDro a true partner. InDro’s capabilities and agility will help speed the path of Real Life – and our clients – to commercialisation.

Below: BUBS in action during a pilot project in Surrey, BC


SPREADING THE NEWS

 

Real Life Robotics issued a news release on its partnership with InDro, which you can find here. But we’ll take the liberty to borrow a section from it:

“The ground robotic delivery market is still very new,” explains Waite. “We engage with both commercial/industrial and government clients who want to lead the charge in adoption of this exciting technology we’ve created.”

Using a combination of hardware, software, and artificial intelligence, Real Life Robotics’ flagship product, called BUBS™, provides cargo and delivery automation at scale. The company’s unique Robot-as-a-Service model provides clients with a white-labelled, customized robot at an accessible cost, allowing businesses to realize immediate top line and bottom line impact.

Businesses in the manufacturing, retail, healthcare, agriculture and food services sectors, among others, can utilize BUBS™ for a variety of last-mile delivery applications, with BUBS™ providing immediate solutions to labor shortages, as well as cost savings and labor efficiency gains, while driving additional new revenue streams.

InDro and Real Life Robotics will work closely together to enhance robot offerings, as well as identify new ways of collaborating in a fast-growing marketplace.

INDRO’S TAKE

 

We’re obviously equally pleased with this partnership – and are eager to build and customise ground robot solutions for the clients of Real Life Robotics.

“Cameron Waite has deep expertise in sales and support of aerial and ground robotics, along with customer success,” says InDro CEO Philip Reece. “We look forward to creating custom solutions at scale for Cam’s clients. Real Life Robotics is the right company, with the right leader, at the right time.”

You can learn more about Real Life Robotics here. And you can reach CEO Cameron Waite here.


InDro attends Robotics Summit & Expo in Boston

By Scott Simmie

 

There’s nothing like a little trip to Boston at this time of year. Especially when the annual Robotics Summit and Expo is on.

InDro dispatched Account Executive Luke Corbeth and Head of Strategic Innovations Stacey Connors to the show, along with a number of devices either manufactured or distributed by InDro Robotics.

And it was busy. So busy, that it produced a quote we never anticipated.

“I only had time for one pee break all day and didn’t stop talking,” laughs Corbeth.

Between demonstrating a dog-like robot and other devices, speaking with attendees and potential clients, Corbeth says the tempo was absolutely surreal – with a steady stream of people at the InDro booth wanting to learn more about the company and its solutions.

“Honestly our booth was too busy,” adds Connors. “We needed two of us there, manning it nonstop.”

But that’s a good problem to have.

There was a large number of startups in attendance, as well as engineering students, professors, and others from the world of robotics, robotic medicine/surgery and academia. In conversations, Corbeth says many were keen to learn of InDro’s expertise as an integrator.

“A lot of people told us they were having difficulty building their own hardware. They really tinker with hardware and struggle with integration. For someone trying to build an autonomous inspection solution, for example, it can be challenging to focus on what you actually want to achieve if you’re spending so much time on the hardware.”

Below: Team InDro during a microsecond when the booth wasn’t swamped:


PLENTY OF GEAR

 

InDro took a number of products the company has developed or distributes to the show. The Unitree GO1 EDU, seen in the photo above, was a big hit. But there was plenty of interest in InDro products including our new indoor drone – which has capabilities not available with standard commercial drones.

“It’s a ROS-based drone,” explains Corbeth. “It has compute onboard, a depth camera, a 4K camera, and 5G connectivity for remote teleoperations. With a standard DJI drone you don’t have the same ability to develop autonomous and custom applications. But ours can be programmed in ROS (Robot Operating System), which enables different sorts of projects that off-the-shelf drones just can’t do.”
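To give a flavour of what ‘programmed in ROS’ means in practice, here is a minimal ROS 2 (rclpy) node of the kind a developer might write against such a platform: it subscribes to a depth-image topic and logs each frame. The topic and node names are assumptions for illustration only; they are not InDro’s published interface.

    # Minimal ROS 2 (rclpy) sketch: subscribe to a depth-camera topic and log frames.
    # The topic and node names are illustrative assumptions, not InDro's interface.
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image

    class DepthListener(Node):
        def __init__(self):
            super().__init__('depth_listener')
            self.create_subscription(Image, '/camera/depth/image_raw',
                                     self.on_frame, 10)

        def on_frame(self, msg):
            # Each frame could feed an obstacle check or a custom autonomy routine.
            self.get_logger().info(f'depth frame: {msg.width}x{msg.height}')

    def main():
        rclpy.init()
        node = DepthListener()
        rclpy.spin(node)
        node.destroy_node()
        rclpy.shutdown()

    if __name__ == '__main__':
        main()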

There were plenty of engineering students – many specialising in robotics and mechatronics – at the Expo. It’s a sign, if any were needed, of the massive growth in the industry.

“The students really see that,” says Corbeth. “So they put their efforts into learning how to design and build and improve these types of robots and want to be part of that going forward.”

 

MASSROBOTICS

 

Massachusetts has a thriving robotics community, including more than 400 companies that build or utilise robotic solutions. And there was a big presence at the show from MassRobotics, a non-profit innovation hub, accelerator and incubator for robotics and connected devices startups. It offers support and expertise as entrepreneurs move from envisioning a solution through to eventual production and commercialisation. The organisation also frequently teams with industry partners to issue robotics challenges, where university teams try to fulfil specific challenges in order to win cash prizes.

“We help bridge the gap and connect our startups to potential customers and investors, offering facilities and platforms to showcase their technology,” says its website.  “MassRobotics’ mission is to help create and scale the next generation of successful robotics technology and connected devices companies by providing innovative entrepreneurs and startups with the workspace and resources they need to develop, prototype, test and commercialize their products and solutions.”

Because of that mission, there was real interest in InDro’s capabilities, says Corbeth:

“They showcased a number of student-led projects, so it was nice to see what the academic world is building. They also seemed intrigued at the idea of jump-starting their projects with InDro’s integration abilities.”

InDro also told MassRobotics staff about the advanced drone and robot training, testing and evaluation site coming soon to Area X.O.

“They lit up when I talked about the testing site at Area X.O,” says Connors, who is hoping to arrange potential collaboration between MassRobotics, Area X.O and Invest Ottawa. “It’s all about opening doors.”

Below: An image from the MassRobotics website, showing some of the 400 companies manufacturing or utilizing robotics in Massachusetts.

 


CLIENT VISITS

 

For Luke Corbeth, who drove down with a car absolutely jammed with robots and a drone, the Robotics Summit & Expo was just part of a very busy week. He also visited clients at the University of Massachusetts, as well as Boston University’s College of Engineering – which has purchased a fleet of Limo R&D robots. (The department is apparently doing research involving using the robots in collaborative swarms.)

It was an opportunity he welcomed.

“In the post-Covid era, a lot of interactions are online so it’s nice to actually meet the clients face-to-face, hear about their problems and successes and use that feedback to better service them and improve our products.”

And a personal highlight for Luke? A booth visit from Aaron Prather, Director of the Robotics & Autonomous Systems Program at ASTM International. Prather is followed by nearly 40,000 people on LinkedIn, where he posts prolifically on developments in the field of robotics.

“I feel like I met the Michael Jackson of robotics,” says Corbeth.


INDRO’S TAKE

 

We were pleased to make some connections – and likely some sales – at the Robotics Summit & Expo. We were also pleased to see the immense interest in what InDro does (something we covered at length in a post here).

“While conferences often bring sales, sometimes exposure and making new connections are just as valuable – or more,” says InDro Robotics CEO Philip Reece.

“We’ve developed many partnerships that began as simple conversations at events like these, and we look forward to building more.”

We generally give advance notice when we’re attending conferences via LinkedIn and Twitter. Give us a follow and stay up to date on InDro developments.

Still a long road to fully autonomous passenger cars

By Scott Simmie

 

We hear a lot about self-driving cars – and that’s understandable.

There are a growing number of Teslas on the roads, with many owners testing the latest beta versions of FSD (Full Self-Driving) software. The latest release allows for automated driving on both highways and in cities – but the driver still must be ready to intervene and take control at all times. Genesis, Hyundai and Kia electric vehicles can actively steer, brake and accelerate on highways while the driver’s hands remain on the wheel. Ford EVs offer something known as BlueCruise, a hands-free mode that can be engaged on specific, approved highways in Canada and the US. Other manufacturers, such as Honda, BMW and Mercedes, are also in the driver-assist game.

So a growing number of manufacturers offer something that’s on the path to autonomy. But are there truly autonomous vehicles intended to transport humans on our roads? If not, how long will it take until there are?

Good question. And it was one of several explored during a panel on autonomy (and associated myths) at the fifth annual CAV Canada conference, which took place in Ottawa December 5. InDro’s own Head of Robotic Solutions (and Tesla owner) Peter King joined other experts in the field on the panel.

 


Levels of autonomy

 

As the panel got underway, there were plenty of acronyms being thrown around. The most common were L2 and L3, standing for Level 2 and Level 3 on a scale of autonomy that ranges from zero to five.

This scale was created by the Society of Automotive Engineers as a reference classification system for motor vehicles. At Level 0, there is no automation whatsoever, and all aspects of driving require human input. Think of your standard car, where you basically have to do everything. Level 0 cars can have some assistive features such as stability control, collision warning and automatic emergency braking. But because none of those features are considered to actually help drive the car, such vehicles remain in Level 0.

Level 5 is a fully autonomous vehicle capable of driving at any time of the day or night and in any conditions, ranging from a sunny day with dry pavement through to a raging blizzard or even a hurricane (when, arguably, no one should be driving anyway). The driver does not need to do anything other than input a destination, and is free to watch a movie or even sleep during the voyage. In fact, a Level 5 vehicle would not need a steering wheel, gas pedal, or other standard manual controls. It would also be capable of responding in an emergency situation completely on its own.

Currently, the vast majority of cars on the road in North America are Level 0. And even the most advanced Tesla would be considered Level 2. There is a Level 3 vehicle on the roads in Japan, but there are currently (at least to the best of our knowledge and research), no Level 3 vehicles in the US or Canada.

As consumer research and data analytics firm J.D. Power describes it:

“It is worth repeating and emphasizing the following: As of May 2021, no vehicles sold in the U.S. market have a Level 3, Level 4, or Level 5 automated driving system. All of them require an alert driver sitting in the driver’s seat, ready to take control at any time. If you believe otherwise, you are mistaken, and it could cost you your life, the life of someone you love, or the life of an innocent bystander.”

To get a better picture of these various levels of autonomy, take a look at this graphic produced by the Society of Automotive Engineers International.


Now we’ve got some context…

 

So let’s hear what the experts have to say.

The consensus, as you might have guessed, is that we’re nowhere near the elusive goal of a Level 5 passenger vehicle.

“Ten years ago, we were all promised we’d be in autonomous vehicles by now,” said panel moderator Michelle Gee, Business Development and Strategy Director with extensive experience in the automotive and aerospace sectors. Gee then asked panelists for their own predictions as to when the Level 4 or 5 vehicles would truly arrive.

“I think we’re still probably about seven-plus years away,” offered Colin Singh Dhillon, CTO with the Automotive Parts Manufacturers’ Association.

“But I’d also like to say, it’s not just about the form of mobility, you have to make sure your infrastructure is also smart as well. So if we’re all in a bit of a rush to get there, then I think we also have to make sure we’re taking infrastructure along with us.”


It’s an important point.

Vehicles on the path to autonomy currently have to operate within an infrastructure originally built for human beings operating Level 0 vehicles. Such vehicles, as they move up progressive levels of autonomy, must be able to scan and interpret signage and traffic lights, understand weather and traction conditions – and much more.

Embedding smart technologies along urban streets and even on highways could help enable functionalities and streamline data processing in future. If a Level 4 or 5 vehicle ‘knew’ there was no traffic coming at an upcoming intersection, there would be no need to stop. In fact, if *all* vehicles were Level 4 or above, smart infrastructure could fully negate the need for traffic lights and road signs entirely.

 

Seven to 10 years?

 

If that’s truly the reality, why is there so much talk about autonomous cars right now?

The answer, it was suggested, is in commonly used – but misleading – language. The term “self-driving” has become commonplace, even when referring solely to the ability of a vehicle to maintain speed and lane position on the highway. Tesla refers to its beta software as “Full Self-Driving.” And when consumers hear that, they think autonomy – even though such vehicles are only Level 2 on the autonomy scale. So some education around language may be in order, suggested some panelists.

“It’s the visual of the word ‘self-driving’ – which somehow means: ‘Oh, it’s autonomous.’ But it isn’t,” explained Dhillon. “…maybe make automakers change those terms. If that was ‘driver-assisted driving,’ then I don’t think people would be sleeping at the wheel whilst they’re on the highway.”

One panelist suggested looking ahead to Level 5 may be impractical – and even unnecessary. Recall that Level 5 means a vehicle capable of operating in all conditions, including weather events like hurricanes, where the vast majority of people would not even attempt to drive.

“It’s not safe for a human to be out in those conditions…I think we should be honing down on the ‘must-haves,’” offered Selika Josiah Talbott, a strategic advisor known for her thoughtful takes on autonomy, EVs and mobility.

“Can it move safely within communities in the most generalised conditions? And I think we’re clearly getting there. I don’t even know that it’s (Level 5) something we need to get to, so I’d rather concentrate on Level 3 and Level 4 at this point.”

 


InDro’s Peter King agrees that Level 5 isn’t coming anytime soon.

“I believe the technology will be ready within the next 10 years,” he says. “But I believe it’ll take 30-40 years before we see widespread adoption due to necessary changes required in infrastructure, regulation and consumer buy-in.”

And that’s not all.

“A go-to-market strategy for Level 5 autonomy is a monumental task. It involves significant investments in technology and infrastructure – and needs to be done so in collaboration with regulators while also factoring in safety and trust from consumers with a business model that is attainable for the masses.”

What about robots?

Specifically, what about Uncrewed Ground Vehicles like InDro’s Sentinel inspection robot, designed for monitoring remote facilities like electrical substations and solar farms? Sentinel is currently teleoperated over 4G and 5G networks with a human controlling the robot’s actions and monitoring its data output. 

Yet regular readers will also know we recently announced InDro Autonomy, a forthcoming software package we said will allow Sentinel and other ROS2 (Robot Operating System) machines to carry out autonomous missions.

Were we – perhaps like some automakers – overstating things?

“The six levels of autonomy put together by the SAE are meant to apply to motor vehicles that carry humans,” explains Arron Griffiths, InDro’s lead engineer. In fact, there’s a separate categorization for UGVs.

The American Society for Testing and Materials (ASTM), which creates standards, describes those tiers as follows: “Currently, an A-UGV can be at one of three autonomy levels: automatic, automated, or autonomous. Vehicles operating on the first two levels (automatic and automated) are referred to as automatic guided vehicles (AGVs), while those on the third are called mobile robots.”

“With uncrewed robots like Sentinel, we like to think of autonomy as requiring minimal human intervention over time,” explains Griffiths. “Because Sentinel can auto-dock for wireless recharging in between missions, we believe it could go for weeks – quite likely even longer – without human intervention, regardless of whether that intervention is in-person or virtual,” he says.

“The other thing to consider is that these remote ground robots, in general, don’t have to cope with the myriad of inputs and potential dangers that an autonomous vehicle driving in a city must contend with. Nearly all of our UGV ground deployments are in remote and fenced-in facilities – with no people or other vehicles around.”

So yes, given that InDro’s Sentinel will be able to operate independently – or with minimal human intervention spread over long periods – we are comfortable with saying that machine will soon be autonomous. It will even be smart enough to figure out shortcuts over time that might make its data missions more efficient.

It won’t have the capabilities of that elusive Level 5 – but it will get the job done.
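Framed that way, autonomy for a robot like Sentinel is less about split-second driving decisions and more about a long-running mission loop: leave the dock, run the inspection route, capture data, return and recharge, and only flag a human when something goes wrong. The Python sketch below is our own simplified illustration of that loop – it is not InDro Autonomy code, and every name and threshold in it is an assumption.

    # Simplified illustration of a dock-recharge-inspect loop for a facility robot.
    # Everything here is an assumption for illustration; it is not InDro Autonomy code.

    class SimulatedRobot:
        """Stand-in robot so the loop below can run as-is."""
        def __init__(self):
            self.battery = 100
        def battery_percent(self):
            return self.battery
        def navigate_to(self, wp):
            self.battery -= 5
            print(f"driving to {wp}")
        def capture_data(self, wp):
            print(f"capturing data at {wp}")
        def fault_detected(self):
            return False
        def notify_operator(self, wp):
            print(f"alerting operator about {wp}")
        def return_to_dock(self):
            print("docked and recharging")
            self.battery = 100

    MIN_BATTERY_TO_START = 80  # percent charge required before leaving the dock

    def run_inspection_cycle(robot, waypoints):
        """One unattended cycle: inspect each waypoint, then return to the dock."""
        if robot.battery_percent() < MIN_BATTERY_TO_START:
            return                          # stay docked until charged
        for wp in waypoints:
            robot.navigate_to(wp)           # autonomous drive within the fenced site
            robot.capture_data(wp)          # e.g. thermal and visual readings
            if robot.fault_detected():
                robot.notify_operator(wp)   # a human steps in only when needed
        robot.return_to_dock()              # wireless recharge until the next run

    run_inspection_cycle(SimulatedRobot(),
                         ["substation gate", "transformer 1", "solar row 3"])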

InDro’s take

 

Autonomy isn’t easy. Trying to create a fully autonomous vehicle that can safely transport a human (and drive more safely than a human in all conditions), is a daunting task. We expect Level 5 passenger vehicles will come, but there’s still a long road ahead.

Things are easier when it comes to Uncrewed Ground Vehicles collecting data in remote locations (which is, arguably, where they’re needed most). They don’t have to deal with urban infrastructure, unpredictable drivers, reading and interpreting signage, etc.

That doesn’t mean it’s easy, of course – but it is doable.

And we’re doing it. Stay tuned for the Q1 release of InDro Autonomy.