
Weeks 6-7: Robot Simulation with Gazebo and Unity

Learn to simulate robots in realistic virtual environments before deploying to physical hardware. Simulation accelerates development, reduces costs, and enables safe testing of dangerous scenarios.

Why Robot Simulation?

Benefits of Simulation

Cost-Effective Development:
Test algorithms without expensive hardware. Simulate thousands of iterations in parallel without wear and tear on physical robots.

Safe Experimentation:
Test dangerous scenarios (falls, collisions, extreme conditions) without risk to hardware or people. Perfect for developing safety-critical systems.

Rapid Prototyping:
Iterate on design quickly by modifying robot models in software. Test different sensor configurations, actuator capabilities, and mechanical designs.

Parallel Development:
Develop and test software before hardware is available. Multiple team members can work simultaneously without sharing physical robots.

Simulation Limitations

Reality Gap:
Simulated physics never perfectly match real-world behavior. Friction, material properties, sensor noise, and complex dynamics are approximations.

Sensor Accuracy:
Simulated sensors (LiDAR, cameras) may not capture real-world imperfections like lens distortion, motion blur, or lighting variations.

Computational Cost:
High-fidelity physics simulation requires significant computing power, especially for complex robots with many degrees of freedom.

Gazebo Simulation Environment

Gazebo is the industry-standard 3D robot simulator integrated with ROS 2. It provides high-fidelity physics simulation, sensor models, and realistic rendering.

Key Features

  • Physics Engines: Multiple engines supported (ODE, Bullet, Simbody, DART)
  • Sensor Simulation: LiDAR, cameras (RGB/depth), IMU, GPS, force/torque
  • ROS 2 Integration: Seamless communication via ros_gz_bridge
  • Plugin System: Extend functionality with custom C++ or Python plugins
  • Worlds & Models: Pre-built environments and robot models

Installation (Ubuntu 22.04)

# Install Gazebo Fortress (recommended for ROS 2 Humble)
sudo apt-get update
sudo apt-get install ros-humble-ros-gz

# Verify installation
ign gazebo --version   # Fortress ships the 'ign' CLI; Garden and later use 'gz sim'

# (Optional) Gazebo Classic interfaces, still used by some older tutorials
sudo apt-get install ros-humble-gazebo-ros-pkgs

Launching Gazebo with ROS 2

# Launch Gazebo with an empty world
ros2 launch ros_gz_sim gz_sim.launch.py gz_args:=empty.sdf

# Launch with a specific world file
ros2 launch ros_gz_sim gz_sim.launch.py gz_args:=./my_world.sdf

# Spawn a robot model
ros2 run ros_gz_sim create -name my_robot \
    -file /path/to/robot.urdf -x 0.0 -y 0.0 -z 1.0

Robot Description Formats

URDF (Unified Robot Description Format)

URDF is an XML-based format for describing robot kinematics, dynamics, and visual appearance. It's the standard format for ROS robots.

Basic URDF Structure

<?xml version="1.0"?>
<robot name="simple_robot">

  <!-- Base Link -->
  <link name="base_link">
    <visual>
      <geometry>
        <box size="0.5 0.3 0.1"/>
      </geometry>
      <material name="blue">
        <color rgba="0 0 0.8 1"/>
      </material>
    </visual>
    <collision>
      <geometry>
        <box size="0.5 0.3 0.1"/>
      </geometry>
    </collision>
    <inertial>
      <mass value="5.0"/>
      <inertia ixx="0.1" ixy="0.0" ixz="0.0"
               iyy="0.1" iyz="0.0" izz="0.1"/>
    </inertial>
  </link>

  <!-- Wheel Link -->
  <link name="left_wheel">
    <visual>
      <geometry>
        <cylinder radius="0.1" length="0.05"/>
      </geometry>
      <material name="black">
        <color rgba="0.1 0.1 0.1 1"/>
      </material>
    </visual>
    <collision>
      <geometry>
        <cylinder radius="0.1" length="0.05"/>
      </geometry>
    </collision>
    <inertial>
      <mass value="1.0"/>
      <inertia ixx="0.01" ixy="0.0" ixz="0.0"
               iyy="0.01" iyz="0.0" izz="0.01"/>
    </inertial>
  </link>

  <!-- Joint connecting base to wheel. A URDF cylinder's symmetry axis is z,
       so roll the wheel -90 degrees about x and spin it about its own z axis -->
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <origin xyz="0.2 0.2 -0.05" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>

</robot>

URDF Components Explained

Links:
Physical components of the robot (chassis, wheels, arms, sensors)

  • <visual>: How the link appears visually (mesh, color, geometry)
  • <collision>: Simplified geometry for collision detection
  • <inertial>: Mass and inertia tensor for physics simulation

Joints:
Connections between links that define motion constraints

  • fixed: No movement (e.g., camera mounted on chassis)
  • revolute: Rotation with limits (e.g., elbow joint)
  • continuous: Unlimited rotation (e.g., wheel)
  • prismatic: Linear sliding motion (e.g., elevator)

Coordinate Frames:
Each link has its own coordinate frame. Joints define the transformation between the parent and child frames via the <origin> tag.
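
Because URDF is plain XML, the link/joint tree can be inspected programmatically. The sketch below uses only Python's standard library to parse a trimmed-down version of the URDF above and list each joint with its parent and child links (the embedded URDF string stands in for a file on disk):

```python
import xml.etree.ElementTree as ET

URDF = """<robot name="simple_robot">
  <link name="base_link"/>
  <link name="left_wheel"/>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>"""

def describe_joints(urdf_string):
    """Return (joint_name, type, parent, child) tuples from a URDF string."""
    root = ET.fromstring(urdf_string)
    return [(joint.get("name"),
             joint.get("type"),
             joint.find("parent").get("link"),
             joint.find("child").get("link"))
            for joint in root.findall("joint")]

for name, jtype, parent, child in describe_joints(URDF):
    print(f"{name} ({jtype}): {parent} -> {child}")
```

The same traversal scales to a full robot model: tools like `check_urdf` do essentially this, then verify that the joints form a tree rooted at a single link.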

URDF with Xacro (Macro Language)

Xacro extends URDF with macros, variables, and mathematical expressions to reduce repetition.

<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="robot">

  <!-- Define constants -->
  <xacro:property name="wheel_radius" value="0.1"/>
  <xacro:property name="wheel_width" value="0.05"/>
  <xacro:property name="wheel_mass" value="1.0"/>

  <!-- Wheel macro -->
  <xacro:macro name="wheel" params="prefix x_pos y_pos">
    <link name="${prefix}_wheel">
      <visual>
        <geometry>
          <cylinder radius="${wheel_radius}" length="${wheel_width}"/>
        </geometry>
        <material name="black">
          <color rgba="0.1 0.1 0.1 1"/>
        </material>
      </visual>
      <collision>
        <geometry>
          <cylinder radius="${wheel_radius}" length="${wheel_width}"/>
        </geometry>
      </collision>
      <inertial>
        <mass value="${wheel_mass}"/>
        <inertia ixx="${wheel_mass * wheel_radius * wheel_radius / 4}"
                 iyy="${wheel_mass * wheel_radius * wheel_radius / 4}"
                 izz="${wheel_mass * wheel_radius * wheel_radius / 2}"
                 ixy="0" ixz="0" iyz="0"/>
      </inertial>
    </link>

    <joint name="${prefix}_wheel_joint" type="continuous">
      <parent link="base_link"/>
      <child link="${prefix}_wheel"/>
      <origin xyz="${x_pos} ${y_pos} 0" rpy="${-pi/2} 0 0"/>
      <axis xyz="0 0 1"/>
    </joint>
  </xacro:macro>

  <!-- Instantiate wheels -->
  <xacro:wheel prefix="front_left" x_pos="0.2" y_pos="0.15"/>
  <xacro:wheel prefix="front_right" x_pos="0.2" y_pos="-0.15"/>
  <xacro:wheel prefix="rear_left" x_pos="-0.2" y_pos="0.15"/>
  <xacro:wheel prefix="rear_right" x_pos="-0.2" y_pos="-0.15"/>

</robot>
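
The inertia expressions in the macro use the thin-disc approximation (ixx = iyy = m·r²/4); the exact solid-cylinder values add a length term, ixx = iyy = m(3r² + h²)/12. A quick numeric check of how much that matters for the wheel dimensions above (pure Python, no ROS needed):

```python
def cylinder_inertia(mass, radius, length):
    """Exact inertia of a solid cylinder about its center (z = symmetry axis)."""
    ixx = iyy = mass * (3 * radius**2 + length**2) / 12
    izz = mass * radius**2 / 2
    return ixx, iyy, izz

mass, r, h = 1.0, 0.1, 0.05          # values from the wheel macro above
ixx_exact, _, izz = cylinder_inertia(mass, r, h)
ixx_approx = mass * r * r / 4        # thin-disc value used in the xacro

print(f"exact ixx  = {ixx_exact:.6f}")
print(f"approx ixx = {ixx_approx:.6f}")
# For a wheel this flat the gap is small, but it grows with length squared.
```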

SDF (Simulation Description Format)

SDF is Gazebo's native format; it is more feature-rich than URDF, especially for simulation-specific parameters.

Key Advantages of SDF

  • Worlds and Models: Can describe entire environments, not just robots
  • Plugins: Native support for sensor and actuator plugins
  • Physics Parameters: More detailed control over friction, damping, contact properties
  • Nested Models: Compose complex scenes from reusable sub-models
  • Versioning: Backward compatibility through version tags

SDF Example

<?xml version="1.0"?>
<sdf version="1.9">
  <model name="mobile_robot">

    <link name="chassis">
      <pose>0 0 0.1 0 0 0</pose>
      <inertial>
        <mass>10.0</mass>
        <inertia>
          <ixx>0.5</ixx>
          <iyy>0.5</iyy>
          <izz>0.8</izz>
        </inertia>
      </inertial>

      <collision name="collision">
        <geometry>
          <box>
            <size>0.6 0.4 0.2</size>
          </box>
        </geometry>
      </collision>

      <visual name="visual">
        <geometry>
          <box>
            <size>0.6 0.4 0.2</size>
          </box>
        </geometry>
        <material>
          <ambient>0.2 0.2 0.8 1</ambient>
          <diffuse>0.2 0.2 0.8 1</diffuse>
        </material>
      </visual>

      <!-- LiDAR Sensor -->
      <sensor name="lidar" type="gpu_lidar">
        <pose>0 0 0.15 0 0 0</pose>
        <update_rate>10</update_rate>
        <lidar>
          <scan>
            <horizontal>
              <samples>640</samples>
              <resolution>1</resolution>
              <min_angle>-3.14159</min_angle>
              <max_angle>3.14159</max_angle>
            </horizontal>
          </scan>
          <range>
            <min>0.1</min>
            <max>30.0</max>
            <resolution>0.01</resolution>
          </range>
        </lidar>
        <always_on>true</always_on>
        <visualize>true</visualize>
      </sensor>

    </link>

    <!-- Differential Drive Plugin (the name attribute is the plugin's class name) -->
    <plugin name="gz::sim::systems::DiffDrive"
            filename="gz-sim-diff-drive-system">
      <left_joint>left_wheel_joint</left_joint>
      <right_joint>right_wheel_joint</right_joint>
      <wheel_separation>0.4</wheel_separation>
      <wheel_radius>0.1</wheel_radius>
      <topic>/cmd_vel</topic>
    </plugin>

  </model>
</sdf>
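
The diff-drive plugin converts each Twist on /cmd_vel into left and right wheel angular velocities. The standard differential-drive kinematics behind that conversion can be sketched in a few lines of plain Python, using the wheel_separation and wheel_radius values from the plugin above:

```python
def twist_to_wheel_speeds(linear_x, angular_z, wheel_separation, wheel_radius):
    """Return (left, right) wheel angular velocities in rad/s
    for a differential-drive base."""
    v_left = linear_x - angular_z * wheel_separation / 2   # m/s at left wheel
    v_right = linear_x + angular_z * wheel_separation / 2  # m/s at right wheel
    return v_left / wheel_radius, v_right / wheel_radius

# wheel_separation = 0.4 m, wheel_radius = 0.1 m, as in the plugin
left, right = twist_to_wheel_speeds(0.5, 0.0, 0.4, 0.1)
print(left, right)   # straight line: both wheels at about 5 rad/s

left, right = twist_to_wheel_speeds(0.0, 1.0, 0.4, 0.1)
print(left, right)   # turn in place: wheels spin in opposite directions
```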

Physics Simulation

Physics Engine Selection

Gazebo supports multiple physics engines with different trade-offs:

ODE (Open Dynamics Engine)

  • Pros: Fast, stable; the default in Gazebo Classic
  • Cons: Less accurate for complex contacts
  • Best for: Mobile robots, simple manipulators

Bullet

  • Pros: Good collision detection, popular in games
  • Cons: Can be unstable with high forces
  • Best for: Multi-contact scenarios

DART

  • Pros: Highly accurate, great for humanoids; the default in modern Gazebo (gz-sim)
  • Cons: Slower than ODE
  • Best for: Legged robots, biomechanical simulations

Physics Parameters

<world name="default">
  <physics type="ode">
    <max_step_size>0.001</max_step_size>
    <real_time_factor>1.0</real_time_factor>
    <real_time_update_rate>1000</real_time_update_rate>

    <ode>
      <solver>
        <type>quick</type>
        <iters>50</iters>
        <sor>1.0</sor>
      </solver>
      <constraints>
        <cfm>0.0</cfm>
        <erp>0.2</erp>
        <contact_max_correcting_vel>100.0</contact_max_correcting_vel>
        <contact_surface_layer>0.001</contact_surface_layer>
      </constraints>
    </ode>
  </physics>
</world>

Key Parameters:

  • max_step_size: Time step for simulation (smaller = more accurate, slower)
  • real_time_factor: Target speed relative to real time (1.0 = real-time)
  • cfm: Constraint Force Mixing (adds springiness to constraints)
  • erp: Error Reduction Parameter (how fast to correct constraint violations)
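
These parameters interact: the simulator attempts real_time_update_rate physics steps per real second, each advancing simulated time by max_step_size, so their product caps the achievable real-time factor. A quick sanity check of the values above (a sketch in plain Python):

```python
def achievable_rtf(max_step_size, real_time_update_rate):
    """Real-time factor the physics loop can reach if every step completes
    in time: simulated seconds advanced per real second."""
    return max_step_size * real_time_update_rate

# Values from the world file above
print(achievable_rtf(0.001, 1000))   # about 1.0: can run in real time

# Halving the step without raising the update rate halves the ceiling
print(achievable_rtf(0.0005, 1000))  # about 0.5: at most half real time
```

So when you shrink max_step_size for accuracy, raise real_time_update_rate in proportion if you still want real-time execution.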

Sensor Simulation

Simulated LiDAR

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class LidarListener(Node):
    def __init__(self):
        super().__init__('lidar_listener')
        self.subscription = self.create_subscription(
            LaserScan, '/scan', self.scan_callback, 10)

    def scan_callback(self, msg):
        # Keep only valid returns (inside the sensor's range limits)
        ranges = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if not ranges:
            return
        min_distance = min(ranges)
        self.get_logger().info(f'Closest obstacle: {min_distance:.2f}m')

        # Detect obstacles in front (±15 beams; that is ±15 degrees
        # only if the scan has one beam per degree)
        front = list(msg.ranges[-15:]) + list(msg.ranges[:15])
        front = [r for r in front if msg.range_min < r < msg.range_max]
        if front and min(front) < 0.5:
            self.get_logger().warn('Obstacle detected ahead!')


def main():
    rclpy.init()
    node = LidarListener()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
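
Slicing a fixed number of elements around index 0 assumes a particular beam layout. A more robust approach derives indices from the scan's own metadata (angle_min and angle_increment, both fields of LaserScan); a sketch of that mapping as a pure function, testable without ROS:

```python
import math

def angle_to_index(angle, angle_min, angle_increment, n_ranges):
    """Map a bearing in radians to the nearest index in LaserScan.ranges,
    clamped to the valid index range."""
    idx = round((angle - angle_min) / angle_increment)
    return max(0, min(n_ranges - 1, idx))

# Example: 360 beams covering a full circle, starting at -pi
n = 360
angle_min = -math.pi
inc = 2 * math.pi / n

print(angle_to_index(0.0, angle_min, inc, n))               # 180: straight ahead
print(angle_to_index(math.radians(15), angle_min, inc, n))  # 195: 15 deg left
```

In the callback above you would compute the two front-sector indices once from the first scan and slice with them instead of hard-coding ±15.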

Simulated Camera

import cv2
import rclpy
from cv_bridge import CvBridge
from rclpy.node import Node
from sensor_msgs.msg import Image


class CameraProcessor(Node):
    def __init__(self):
        super().__init__('camera_processor')
        self.bridge = CvBridge()
        self.subscription = self.create_subscription(
            Image, '/camera/image_raw', self.image_callback, 10)

    def image_callback(self, msg):
        # Convert ROS Image to OpenCV format
        cv_image = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')

        # Process the image (e.g., edge detection)
        edges = cv2.Canny(cv_image, 100, 200)

        # Display
        cv2.imshow('Camera Feed', cv_image)
        cv2.imshow('Edges', edges)
        cv2.waitKey(1)


def main():
    rclpy.init()
    rclpy.spin(CameraProcessor())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
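
Simulated cameras are typically configured by a horizontal field of view; the pinhole model relates that FOV to a focal length in pixels, which is what you need to turn a pixel column into a bearing toward an object. A short sketch of both conversions (plain Python; the 1.047 rad FOV is used here only as an illustrative value):

```python
import math

def focal_length_px(image_width, horizontal_fov):
    """Pinhole focal length in pixels, from image width and horizontal FOV (rad)."""
    return image_width / (2 * math.tan(horizontal_fov / 2))

def pixel_to_bearing(u, image_width, horizontal_fov):
    """Horizontal bearing in radians of pixel column u (0 at image center)."""
    fx = focal_length_px(image_width, horizontal_fov)
    cx = image_width / 2
    return math.atan2(u - cx, fx)

fov = 1.047  # ~60 degrees (illustrative)
print(focal_length_px(640, fov))        # focal length in pixels
print(pixel_to_bearing(320, 640, fov))  # image center -> 0.0 rad
```

This is handy for turning a detected object's pixel position in the edge or camera image into a steering direction for the robot.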

Unity for Robot Visualization

Unity provides high-quality rendering and is excellent for creating training environments for reinforcement learning and demonstrating robots in realistic scenes.

ROS 2 Unity Integration

Unity Robotics Hub:
Official toolkit from Unity for ROS integration, providing:

  • URDF importer for robot models
  • ROS 2 message serialization/deserialization
  • Simulation control from ROS 2

Installation

# Unity side: in the Package Manager, add the ROS-TCP-Connector package
# from its git URL:
# https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.ros-tcp-connector

# ROS 2 side: clone the ROS-TCP-Endpoint package into your workspace and build
cd ~/ros2_ws/src
git clone https://github.com/Unity-Technologies/ROS-TCP-Endpoint.git  # use the ROS 2 branch
cd ~/ros2_ws && colcon build

Unity Simulation Setup

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Geometry;

public class RobotController : MonoBehaviour
{
    private ROSConnection ros;

    void Start()
    {
        // Connect to ROS
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<TwistMsg>("/cmd_vel");

        // Subscribe to robot pose
        ros.Subscribe<PoseStampedMsg>("/robot_pose", UpdatePose);
    }

    void Update()
    {
        // Send velocity commands
        if (Input.GetKey(KeyCode.W))
        {
            var twist = new TwistMsg();
            twist.linear.x = 1.0;
            ros.Publish("/cmd_vel", twist);
        }
    }

    void UpdatePose(PoseStampedMsg msg)
    {
        // Update the Unity transform from the ROS pose.
        // Note: this maps axes directly; a real setup converts between
        // ROS's z-up and Unity's y-up coordinate conventions.
        transform.position = new Vector3(
            (float)msg.pose.position.x,
            (float)msg.pose.position.y,
            (float)msg.pose.position.z
        );
    }
}

Use Cases for Unity

Reinforcement Learning:
Unity ML-Agents provides high-performance parallel simulation for training policies. Run thousands of robot instances simultaneously.

Digital Twins:
Create photorealistic representations of real robots and environments for monitoring and visualization.

Human-Robot Interaction:
Design and test user interfaces, AR/VR applications, and human-centered robotics.

Synthetic Data Generation:
Generate labeled training data for computer vision (segmentation masks, bounding boxes, depth maps) with perfect ground truth.

Practical Exercises

Week 6 Assignment: Gazebo Robot Simulation

Objective: Create and simulate a differential drive robot in Gazebo

Tasks:

  1. Write a URDF model for a mobile robot with:
     • Rectangular chassis
     • Two drive wheels
     • One caster wheel
     • Simulated LiDAR sensor
     • RGB camera
  2. Add Gazebo plugins for:
     • Differential drive controller
     • Sensor data publishing
  3. Create a launch file that:
     • Spawns your robot in Gazebo
     • Launches RViz for visualization
     • Starts a teleoperation node
  4. Test obstacle avoidance by subscribing to LiDAR data and publishing velocity commands

Deliverable:
Complete ROS 2 package with URDF, launch files, and obstacle avoidance node. Record a video demonstration.
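
For the obstacle-avoidance task, it helps to keep the decision logic as a pure function so it can be unit-tested without a simulator. A hedged sketch (thresholds and names are illustrative, not prescribed by the assignment):

```python
def avoidance_command(front_min, stop_distance=0.5,
                      cruise_speed=0.3, turn_speed=0.8):
    """Decide (linear_x, angular_z) from the closest range straight ahead.

    Turn in place when an obstacle is inside stop_distance;
    otherwise drive forward. Defaults are illustrative.
    """
    if front_min < stop_distance:
        return 0.0, turn_speed   # obstacle ahead: stop and rotate away
    return cruise_speed, 0.0     # path clear: go straight

# In your node, call this from the LaserScan callback and publish a Twist:
#   lin, ang = avoidance_command(front_min)
#   twist = Twist(); twist.linear.x = lin; twist.angular.z = ang
#   self.cmd_pub.publish(twist)

print(avoidance_command(0.3))  # obstacle ahead: turn
print(avoidance_command(2.0))  # clear: cruise
```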

Week 7 Assignment: Advanced Simulation

Objective: Compare Gazebo and Unity simulation

Tasks:

  1. Create a custom Gazebo world with obstacles
  2. Implement the same robot in Unity
  3. Test identical navigation scenarios in both simulators
  4. Document differences in performance, visual quality, and ease of use

Deliverable:
Comparative analysis report with screenshots, performance metrics, and recommendations for different use cases.


Next: In Weeks 8-10, we'll explore humanoid robot kinematics and motion planning algorithms.