Project Update: Simulation of the Robot

May 9th, 2019 - Noah Limes

Simulation is an invaluable tool in robotics development. When the alternative is a physical pool test, a well-designed simulator makes it possible to rapidly test algorithms, design robots, perform regression testing, and even train machine learning algorithms on realistic scenarios. Gazebo is a robust physics simulator with high-quality graphics that can be driven both programmatically and through its graphical interface. This year the software team made it a priority to create and support a simulated competition vehicle.

The UUV Simulator is a package containing the Gazebo plugins and ROS nodes needed to simulate unmanned underwater vehicles such as ROVs (remotely operated vehicles) and AUVs (autonomous underwater vehicles). We started from this helpful plugin because of its relevance to our robot and a team member’s valuable previous experience with the package. It let us rapidly set up a package structure we could “piggy-back” on, adding, adjusting, and deleting parts of the plugin as necessary.
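For the curious, here is a minimal sketch of how a ROS node can peek at the simulated vehicle once it is running. The “/maelstrom/pose_gt” topic name is an assumption, not our actual interface; UUV Simulator publishes ground-truth odometry for its example vehicles under a similar per-vehicle name.

```python
#!/usr/bin/env python
# Minimal sketch: print the simulated vehicle's ground-truth pose published
# by the UUV Simulator Gazebo plugins. The topic name is an assumption;
# adjust it to the namespace the vehicle is actually spawned under.
import rospy
from nav_msgs.msg import Odometry


def pose_callback(msg):
    p = msg.pose.pose.position
    rospy.loginfo("Simulated vehicle at x=%.2f y=%.2f z=%.2f", p.x, p.y, p.z)


if __name__ == "__main__":
    rospy.init_node("sim_pose_listener")
    # "/maelstrom/pose_gt" is hypothetical; UUV Simulator vehicles usually
    # publish ground-truth odometry as "<vehicle_namespace>/pose_gt".
    rospy.Subscriber("/maelstrom/pose_gt", Odometry, pose_callback)
    rospy.spin()
```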

When simulating, design specifics and real-world details are often sacrificed for performance. Because of this tradeoff, the first task was simplifying the CAD model of Maelstrom, an intricately detailed Solidworks model. At first, we tried simply deleting most parts in Solidworks and exporting the result. We exported to Blender, a popular open-source modelling and rendering tool, so we could quickly manipulate the scale, orientation, and origin of the simplified model.
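As a rough sketch of that Blender-side cleanup, the snippet below uses Blender’s Python API (bpy) to rescale and re-origin an imported mesh. The file path and scale factor are placeholders, not our real values.

```python
# Rough sketch of the Blender-side cleanup using Blender's Python API (bpy).
# The file path and scale factor are placeholders, not our real values.
import bpy

# Import the simplified mesh exported from Solidworks (STL as an example).
bpy.ops.import_mesh.stl(filepath="/path/to/maelstrom_simplified.stl")
obj = bpy.context.selected_objects[0]

# Solidworks exports are often in millimetres; Gazebo works in metres.
obj.scale = (0.001, 0.001, 0.001)

# Move the origin to the centre of the geometry, then bake the transforms in
# so the exported mesh carries no stray offsets.
bpy.ops.object.origin_set(type='ORIGIN_GEOMETRY')
bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)
```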

This workflow presented some issues. When a Solidworks mesh is imported into Blender, you are stuck with whatever vertex count the importer produces. Vertex and polygon counts are important factors for both simulation and texturing: the more vertices a mesh has, the more complex it is and the more processing it consumes. In other words, processing is wasted on aesthetics that could be spent on physics, which is why it was imperative to simplify the model to an appropriate size in the first place. It also became clear that we could not simply import the Solidworks model into Blender and move on, because the import produced a mesh full of complicated triangular faces. That was a problem because we wanted to texture the model, both to add realism and aesthetics and to establish the method so we could texture other objects in the simulation in the future.
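To give a sense of how the vertex count can be tamed, here is a small sketch using Blender’s Decimate modifier through the Python API. The 0.1 ratio is only an illustrative number, not the one we actually settled on.

```python
# Sketch: cut the vertex/polygon count with Blender's Decimate modifier via
# the Python API. The 0.1 ratio is only an illustrative number.
import bpy

obj = bpy.context.active_object
decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
decimate.ratio = 0.1  # keep roughly 10% of the original faces
bpy.ops.object.modifier_apply(modifier="Decimate")

print("Remaining vertices:", len(obj.data.vertices))
```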

Imagine a box made of folded paper. If you were to unfold this box and flatten the paper, you would have a 2D representation of all six of the box’s faces. In modelling, this is the idea behind a “UV Map”. UV maps are how we texture 3D objects with 2D images, and they are used in video games and simulation all the time.

To “cut” Maelstrom into a UV map, complicated and nonuniform triangles were not the way to go. Since a team member had previous experience with Blender and the simulation mesh was only for aesthetic purposes, we modelled a whole new, separate mesh in Blender. This gave the model organized, easy-to-work-with polygon faces, which made it very easy to cut “seams” into the mesh, unwrap it into a flat 2D UV map, and apply any texture we desired.
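For anyone wanting to reproduce the unwrap step, here is a rough sketch of the equivalent calls in Blender’s Python API. In practice we marked the seams by hand in the interface, so the edge selection below is only a placeholder.

```python
# Rough sketch of the seam-and-unwrap step in Blender's Python API. We marked
# seams by hand in the interface; the edge selection here is a placeholder.
import bpy

obj = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')

# Select the edges to cut along, then mark them as UV seams.
bpy.ops.mesh.select_all(action='SELECT')  # placeholder: select the real seam edges
bpy.ops.mesh.mark_seam(clear=False)

# Unwrap the mesh into a flat 2D UV map along the marked seams.
bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.02)

bpy.ops.object.mode_set(mode='OBJECT')
```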

However, exporting textures within the model files from Blender into Gazebo is not as simple as expected. We learned that there is a very specific way to spawn models with arbitrary textures into our simulation. Currently, after pooling information from many different and conflicting tutorials and forums, the mesh spawns into the simulation but the texture does not load, despite no errors being reported. More work is required.
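For reference, spawning an SDF model into a running Gazebo world from Python looks roughly like the sketch below, using the spawn service provided by gazebo_ros. The file path, model name, and starting pose are placeholders.

```python
# Sketch: spawn an SDF model (e.g. the textured Maelstrom mesh) into a running
# Gazebo world through the spawn service provided by gazebo_ros. The file
# path, model name, and starting pose are placeholders.
import rospy
from gazebo_msgs.srv import SpawnModel
from geometry_msgs.msg import Pose

rospy.init_node("spawn_maelstrom")
rospy.wait_for_service("/gazebo/spawn_sdf_model")
spawn = rospy.ServiceProxy("/gazebo/spawn_sdf_model", SpawnModel)

with open("/path/to/maelstrom/model.sdf") as f:
    model_xml = f.read()

pose = Pose()
pose.position.z = -2.0  # start the vehicle a couple of metres under water

result = spawn(model_name="maelstrom",
               model_xml=model_xml,
               robot_namespace="/maelstrom",
               initial_pose=pose,
               reference_frame="world")
rospy.loginfo("Spawn result: %s", result.status_message)
```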

More importantly, sensors are the next big thing to add to the simulation. To run our actual robot control code, Gazebo allows sensors to be attached to the robot, and all of the ones we use on the real vehicle, the cameras, the depth sensor, the IMU, and so on, will be simulated. Luckily, UUV Simulator has examples of how to include our sensors in its framework. In many cases, all that is needed is a simple ROS node to translate messages between our control code and the simulator’s messages.
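As an illustration of what such a translator node might look like, here is a minimal sketch that republishes a simulated pressure reading as the depth value our control code expects. Both topic names and the unit conversion are assumptions, not our actual interfaces.

```python
# Minimal sketch of a translator node: it republishes a simulated pressure
# reading as the depth value the control code expects. Both topic names and
# the unit conversion are assumptions.
import rospy
from sensor_msgs.msg import FluidPressure
from std_msgs.msg import Float32


class DepthTranslator(object):
    def __init__(self):
        # Topic the control code listens to on the real vehicle (assumed name).
        self.pub = rospy.Publisher("/maelstrom/depth", Float32, queue_size=10)
        # Topic published by the simulated pressure sensor (assumed name).
        rospy.Subscriber("/maelstrom/pressure", FluidPressure, self.callback)

    def callback(self, msg):
        # Convert absolute pressure (assumed Pa) to depth in metres of water.
        depth = (msg.fluid_pressure - 101325.0) / (1000.0 * 9.81)
        self.pub.publish(Float32(data=depth))


if __name__ == "__main__":
    rospy.init_node("sim_depth_translator")
    DepthTranslator()
    rospy.spin()
```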

Adding the sensors and solving the texture issues are the main things left to be done, but the path forward is clear. The simulator will be a valuable tool in the software team’s arsenal and should give great value to future teams. In addition to all the benefits already discussed, we hope to show off the simulator at showcases and fairs as a very visual and user-friendly way to see the robot in action. Before you know it, the simulator will become a simu-NOW.