By Scott Simmie
Area X.O, the Ottawa facility founded and operated by Invest Ottawa that houses cutting-edge companies involved in robotics and smart mobility R&D, has unveiled a powerful new tool.
It’s a simulation portal that will allow firms to virtually test products under development. Want to put a robot through its paces on the roads at Area X.O to evaluate its propulsion system and battery life? Have a drone overfly and capture data? Maybe you want to test in snow and cold temperatures, despite it being summertime?
Unless you happen to be an Area X.O tenant, carrying out any of these tasks in real life would involve getting permission, getting your product to the site – even waiting for months and making multiple trips if you wanted to test under a variety of weather conditions. The costs would quickly add up, and your development time would stretch.
With the new simulator, you can put your robot or drone (or sensor) through its paces remotely – whether you’re in Ottawa, Vancouver, or even further afield. And you can use the data gathered in the simulator to improve and refine your real-world product.
“Until recently, Area X.O was limited to the physical world,” said Patrick Kenny, Senior Director of Marketing and Communications for Invest Ottawa, Area X.O and Bayview Yards.
“This past winter, Area X.O launched a simulation discovery portal powered by Ansys. The simulation portal and program promote simulation and its ability to reduce time, cost, effort and risk – getting breakthrough innovations to market faster. Innovators now have a new option to consider.”
Kenny made his remarks during a June 7 webinar. During that event, Area X.O engineers Barry Stoute and Hossain Samei explained how the system works – and even carried out a real-time demonstration.
POWERED BY ANSYS
The brains behind the system come from Ansys, which has been in the simulation software business for more than 50 years. Its software is widely regarded as among the most powerful of its kind.
“Simulation is an artificial representation of a physical model,” explained simulation engineer Dr. Stoute. He went on to explain, at a high level, two different types of simulation: Finite Element Analysis (FEA) and Digital Mission Engineering.
In a nutshell, FEA uses software (and really good computers) to see how different models behave under different conditions. The model can be anything: A robot, an antenna, a drone – you name it.
“Finite Element Analysis solves for mechanical structures, thermal analysis, electronics and optical (components),” explained Dr. Stoute. Want to know what temperature a component might reach under load? How a transmitter or antenna might behave at different temperatures? Even “see” what an optical sensor might capture when mounted on a robot? Plug in the right parameters and powerful computing will give the answer.
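To make the idea concrete, here is a minimal sketch of the thermal side of FEA: a one-dimensional rod discretized into finite elements, with one end held at room temperature and a heat load applied at the other. This is an illustrative toy solver, not the Ansys implementation, and all material and load values are assumed.

```python
import numpy as np

# Toy 1D steady-state heat-conduction FEA (illustrative only -- not Ansys).
# A 0.1 m aluminium rod: one end held at 25 C, the other receiving 5 W
# from a component under load. All figures are assumptions.
k = 200.0     # thermal conductivity, W/(m*K)
A = 1e-4      # cross-sectional area, m^2
length = 0.1  # rod length, m
n_el = 10     # number of linear elements
h = length / n_el

# Assemble the global conductance matrix from identical element matrices.
K = np.zeros((n_el + 1, n_el + 1))
ke = (k * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):
    K[e:e + 2, e:e + 2] += ke

# Load vector: 5 W injected at the free end.
f = np.zeros(n_el + 1)
f[-1] = 5.0

# Dirichlet condition: node 0 fixed at 25 C; solve for the remaining nodes.
T = np.full(n_el + 1, 25.0)
T[1:] = np.linalg.solve(K[1:, 1:], f[1:] - K[1:, 0] * 25.0)
print(f"Hot-end temperature: {T[-1]:.1f} C")
```

Real FEA tools do the same assemble-and-solve dance, just in three dimensions with millions of elements and coupled physics – hence the need for serious computing power.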
DIGITAL MISSION ENGINEERING
This type of simulation is a way of designing a complex system, particularly one where multiple assets interact with one another in a simulated environment. In the example seen below, Dr. Stoute says a digital mission engineer could create a model where a drone capturing data interacts with multiple objects: satellite communications, a ground station and multiple vehicles. The drone’s mission is to capture data from the ground, but the engineer is interested in seeing the Big Picture – the ways in which all these different assets will interact.
The mission engineer can select and modify the parameters of every asset in that model. How powerful is the ground station and what range will it provide? What speed is the aircraft flying at, and at what altitude? What type of aircraft is it? What sensors are on the drone and what are their specifications? What is the battery life? What are the specifications of the drone’s motors? The ambient temperature and wind conditions?
The options are dizzying. But the software – along with a well-trained mission engineer – can create a virtual world where the data outcomes closely predict what would happen in a real-world mission.
“If an engineer creates a physical product and it doesn’t work as planned, they have to go back and remodel it,” explained Dr. Stoute. The simulation environment, by contrast, allows the engineer to tweak that product in a virtual environment without the expense of real-world modifications. Once the product is working well in simulation, those learnings can be applied to the actual physical product.
Plus, of course, weather parameters can easily be changed; something impossible in real-world testing (unless you’ve got lots of time on your hands).
“Should he wait until January to get a blizzard to test the product?” asked Dr. Stoute.
“No, it doesn’t make sense. The simulator can simulate blizzard conditions.”
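The workflow Dr. Stoute describes – define each asset’s parameters, then re-run the same mission under different conditions – can be sketched in a few lines. All of the numbers below (battery capacity, power draw, cold-weather derating) are illustrative assumptions, not Area X.O or Ansys figures.

```python
from dataclasses import dataclass

@dataclass
class Drone:
    speed_ms: float        # cruise speed, m/s
    battery_wh: float      # usable battery energy, Wh
    cruise_power_w: float  # average power draw at cruise, W

def endurance_min(drone: Drone, capacity_factor: float = 1.0) -> float:
    """Flight endurance in minutes; capacity_factor derates the battery
    (lithium cells deliver noticeably less energy in the cold)."""
    return (drone.battery_wh * capacity_factor) / drone.cruise_power_w * 60.0

quad = Drone(speed_ms=10.0, battery_wh=90.0, cruise_power_w=300.0)

# Re-run the "mission" under different weather -- no waiting for January.
# The derating factors are assumed for illustration.
for label, factor in [("summer, +25 C", 1.00), ("blizzard, -20 C", 0.70)]:
    t = endurance_min(quad, factor)
    rng_km = quad.speed_ms * t * 60 / 1000
    print(f"{label}: {t:.0f} min endurance, {rng_km:.1f} km range")
```

Swapping a parameter and re-running is the whole point: the same model answers "will this mission still close in a blizzard?" in seconds.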
Below: Dr. Stoute explains how Digital Mission Engineering works during the webinar
With the basics explained, the webinar moved on to demonstrate these concepts. Area X.O engineer Hossain Samei took over the controls, carrying out a real-time demo of the sim’s capabilities.
For this, Samei used not only the Ansys core system, but another powerful piece of software called Ansys AVxcelerate, which is used to test and validate sensors for self-driving cars. That means you can plug virtual sensors – including all of their technical parameters – into the system. And not simply the sensors on the cars: in this simulation, which features a very high-resolution 3D map of the Area X.O complex, Samei also had the sensors installed at the Area X.O site embedded into this virtual world.
“This digital twin also includes the infrastructure embedded into our smart city zone,” explained Samei. “This includes multiple sensors, optical cameras, roadside units, thermal cameras and LiDAR cameras.” The model even includes functioning railroad crossing gates.
“We’re able to simulate the arms moving up and down,” he said.
And remember how the Ansys system can simulate weather? The mission engineer can also tailor lighting conditions – very useful for testing visual sensors.
VIRTUAL TEST DRIVE
Samei already had the digital twin of Area X.O defined. He then quickly put together an autonomous vehicle and camera sensor using AVxcelerate.
“Once we have our car defined, as well as the sensors on the vehicle, we’re able to move on to choosing a car simulator,” said Samei.
To help the car drive on Area X.O’s terrain, Samei turned to the open-source Webots robot simulator.
“With Webots, you can define your vehicle, including its suspension, power train and other features to define the vehicle dynamics of the car,” said Samei.
And now? It was time for a drive.
Samei began to pilot the car around Area X.O – showing as well that he could change the setting from a clear and dry day to one with snow on the ground with just a few clicks. As the car drove down the road, you could see some of the Smart City sensors that are physically (and virtually) embedded in the Area X.O environment.
“You can see as we pull up, all of the sensors in the environment are visible. That kind of demonstrates what we’re able to do with this model,” he said.
VIRTUAL DRONE FLIGHT
Samei then moved on to programming an autonomous drone flight over one of the experimental farm fields that surround the Area X.O facility. For this portion of the demo, he utilized the Ansys STK toolkit – specifically designed for Digital Mission Engineering. You’ll recall Dr. Stoute spoke of this, and its ability to simulate entire systems – including ground stations, satellite communication, etc.
Samei defined the area of the field to be scanned, then “built” the quadcopter by selecting motors, battery, propellers – even the pitch of the blades.
“We end up with a very accurate model of a drone that reflects its actual performance,” he said.
He also programmed the altitude of the drone and the density of the scan – with passes over the field 400′ apart. With that and a few more clicks (all in real-time, which was pretty impressive to watch), he sent the drone off on its mission.
The virtual drone quickly scanned the desired area and returned to base with power to spare. Samei then plotted a more tightly focused grid – lower altitude and more overlap, with grid passes 200′ apart – for greater data density. Then he sent the quadcopter off again.
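The effect of tightening the grid can be estimated with simple back-and-forth ("lawnmower") scan geometry. The field dimensions and cruise speed below are assumptions for illustration, not the values used in the demo.

```python
import math

def lawnmower_stats(width_m, length_m, spacing_m, speed_ms):
    """Pass count, total path length (m) and flight time (min) for a
    back-and-forth scan of a rectangular field."""
    passes = math.ceil(width_m / spacing_m) + 1            # parallel legs
    path_m = passes * length_m + (passes - 1) * spacing_m  # legs + crossovers
    return passes, path_m, path_m / speed_ms / 60.0

FT = 0.3048                      # metres per foot
field_w, field_l = 400.0, 600.0  # assumed field size, m
speed = 10.0                     # assumed cruise speed, m/s

for spacing_ft in (400, 200):
    passes, path, minutes = lawnmower_stats(field_w, field_l, spacing_ft * FT, speed)
    print(f"{spacing_ft} ft spacing: {passes} passes, {path/1000:.1f} km, {minutes:.1f} min")
```

Halving the pass spacing roughly doubles the number of legs – and with it the distance flown and the energy drawn from the battery, which is why the tighter grid prompts a fresh power check.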
In this example, Samei was interested in whether the quadcopter could cover the scan with its existing power supply. He was also keen to learn if the ground station would be able to communicate with the drone throughout its mission. Both of these questions were answered in the affirmative without having to use a physical drone.
“We were able to verify the flight does not need more energy than the battery can provide,” he observed. “We can (also) see the minimum signal strength required – so indeed we are able to maintain consistent communication throughout the mission.”
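The communications check boils down to a link budget: transmit power plus antenna gains, minus path loss, must stay above the receiver’s sensitivity at the farthest point of the mission. The sketch below uses a free-space path-loss model with generic 2.4 GHz radio figures – assumptions for illustration, not the demo’s actual parameters.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Generic 2.4 GHz link-budget assumptions (not the demo's real radio specs).
tx_power_dbm = 20.0      # transmitter output
tx_gain_dbi = 2.0        # drone antenna
rx_gain_dbi = 8.0        # ground-station antenna
sensitivity_dbm = -95.0  # minimum usable received signal

max_range_m = 2000.0     # farthest point of the scan pattern (assumed)
rx_dbm = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(max_range_m, 2.4e9)
margin_db = rx_dbm - sensitivity_dbm
print(f"Received: {rx_dbm:.1f} dBm, margin: {margin_db:.1f} dB "
      f"({'link OK' if margin_db > 0 else 'link fails'})")
```

Free space is the optimistic case; a full mission-engineering tool also folds in terrain, antenna patterns and obstructions along each leg of the flight.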
That was impressive enough. But get this: The simulation software can even account for potential signal interference caused by buildings. And such flights – whether it’s a drone or a Cessna or a business jet – are not limited to Area X.O. Ansys STK has a database of pretty much anywhere on the planet.
“You can simulate your missions and flights over anywhere on earth,” said Samei.
Below: A screen capture during Hossain Samei’s real-time demo. Here, he’s configuring the technical parameters for a simulated quadcopter’s propulsion system
WAIT, THERE’S MORE
The real-time demo was impressive. But it left one wondering: What kind of computer do you need to make these kinds of simulations actually work? Surely the computational power required exceeds what most of us carry around in our laptops.
And that’s true. But the good news is, the Area X.O simulation portal includes access to the precise kind of computer required.
“What we’re providing with our simulation services is access to our computers,” said Samei.
“We have the workstations necessary that have the computational power, the memory, that’s able to simulate these problems very fast. So it’s not necessary for the clients to have a supercomputer in order to run the simulations. We can take that 10-day simulation time down to 10 hours.”
THE VIRTUAL ADVANTAGE
If it wasn’t clear by now (and it surely was), the webinar wrapped with a reminder of why simulation is such a powerful and cost-effective tool for developers.
“We can do more different physics-based simulations such that you don’t have to build…expensive prototypes,” said Dr. Stoute. “People can actually imagine the wildest designs without any limitations. Having your wildest dreams imaginable.”
Engineer Hossain Samei also weighed in.
“One thing I really do believe in is: Knowledge is power,” he said.
“What simulation…lets us know (is) what’s going to happen and not suffer the consequences from actually having to make a product…and then find out: ‘Oops, I have a problem’. Simulation allows you to circumvent that and identify these issues before, where it’s easier to actually solve them.”
WANT TO TRY IT?
You can! Though the Area X.O simulation portal is ultimately a paid service, those interested in learning more can sign up for further free demos to get a better sense of what this resource is capable of delivering.
Sign up for free on this page.
If you thought you missed a cool demo, you did. But no worries, you can watch a replay of the entire webinar below:
The Ansys platform is widely regarded as a leading simulation platform. And with the expertise of Area X.O engineers Dr. Barry Stoute and Hossain Samei, we’re confident a solution can be tailored for pretty much any product operating in any environment.
“It’s a normal part of R&D to go through various iterations of products following real-world testing,” says InDro Robotics CEO Philip Reece. “And while products ultimately need to be tested in the real world prior to deployment, high-level simulation can save time, money – and mistakes.
“Even though our R&D hub is situated right at Area X.O, we plan on tapping into this powerful tool to analyze some of our products currently on the drawing board.”
If you’re interested in learning more about this new tool, drop Area X.O a line here.