Rigging SO-ARM 100 Robot in Unreal Engine - Complete Control Rig Tutorial
Category:
WORK
November 17, 2025
Complete step-by-step guide to rigging the SO-ARM 100 educational robot using Control Rig in Unreal Engine 5. From CAD files to animated physics.
Special thanks to: Julie Lottering, Chase Cooper, Jeremie Passerin, Matt Ringot, Sara Schvartzman, Ferris Webby, Helge Mathee, Benoit Gaudreau, James Burton, Shenaz Baksh, Sean Spitzer, and Kevin Miller.
From CAD to Animation: Rigging the SO-ARM 100 Robot in Unreal Engine
My Journey with Control Rig

Over the past months, I've had the privilege of working alongside Epic Games' exceptional team, talented instructors, and an amazing technical artist to master one of Unreal Engine's most powerful features: Control Rig. What started as a technical challenge—rigging the SO-ARM 100 educational robot entirely within Unreal—became a transformative learning experience that fundamentally changed how I approach digital content creation.
The Tutorial: Complete Workflow from STEP Files to Animated Robot
Prerequisites
Unreal Engine 5.5+ (with Control Rig enabled)
FreeCAD (Download Free) - Open source CAD software
SO-ARM 100 STEP files - Original engineering files from the robot manufacturer
Basic understanding of 3D modeling concepts
Phase 1: Converting CAD Models to Unreal-Compatible Format
Step 1: Download the Original STEP Files
STEP files (.step or .stp) are the industry standard for CAD data exchange. Engineers use these files for manufacturing, rapid prototyping, and robotics applications. They contain precise geometric and assembly information that we'll leverage for our rig.
Why STEP files?
Parametric data preservation
Industry-standard format across all CAD platforms
Contains accurate assembly hierarchies
Maintains precise measurements critical for robotic applications
Step 2: Convert STEP to glTF Using FreeCAD
Download FreeCAD: https://www.freecad.org/downloads.php
Conversion Process:
Launch FreeCAD and open your STEP file:
File > Open → select your .step file. FreeCAD will import the complete assembly.
Verify the import:
Check that all parts are visible in the 3D viewport
Confirm the assembly hierarchy in the Model tree
Note any parts that may need repositioning
Export as glTF 2.0:
File > Export → choose "glTF 2.0 (*.gltf *.glb)"
Select the .gltf format (not .glb) for easier debugging
Name it appropriately (e.g., SO-ARM-100.gltf)
Why glTF?
Native Unreal Engine import support
Preserves hierarchies and transforms
Open standard with excellent tool support
Maintains material assignments
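One practical payoff of choosing .gltf over binary .glb is that the file is plain JSON, so you can sanity-check the exported node hierarchy before ever opening Unreal. This is a minimal sketch using only the Python standard library; the inline document is a toy stand-in (a real SO-ARM-100.gltf export would contain many more nodes, meshes, and buffers):

```python
import json

# A minimal glTF 2.0 document standing in for the FreeCAD export.
# Node names and translations here are illustrative, not the
# actual SO-ARM 100 assembly.
gltf_text = """
{
  "asset": {"version": "2.0"},
  "scenes": [{"nodes": [0]}],
  "nodes": [
    {"name": "Base",     "children": [1]},
    {"name": "Shoulder", "children": [2], "translation": [0, 0.05, 0]},
    {"name": "Elbow",    "translation": [0, 0.12, 0]}
  ]
}
"""

def print_hierarchy(gltf, node_index, depth=0):
    """Recursively print the node tree - the same parent-child
    structure Unreal reads when building the imported assets."""
    node = gltf["nodes"][node_index]
    print("  " * depth + node.get("name", f"node_{node_index}"))
    for child in node.get("children", []):
        print_hierarchy(gltf, child, depth + 1)

gltf = json.loads(gltf_text)
for root in gltf["scenes"][0]["nodes"]:
    print_hierarchy(gltf, root)
```

If the indentation printed here doesn't match the assembly tree you saw in FreeCAD, the export flattened or reparented something, and it's easier to fix that now than after import.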
Phase 2: Preparing the Mesh in Unreal Engine
Step 3: Import and Transform Correction
Import the glTF into Unreal:
Drag and drop the .gltf file into your Content Browser
By default, Unreal will create Static Mesh assets for each part. You can select all of the meshes, drag them into the scene, and combine them with the Modeling tools. Alternatively, set the "combine mesh" option in the glTF import settings to bring the assembly in as one mesh. It's nice to have both versions imported, separated and combined, for rigging, bone placement, and weight painting.
Fix Transform Offsets:
STEP files often have arbitrary pivot points and rotations
Open each Static Mesh in the Static Mesh Editor
Use Modeling Mode tools to correct positioning:
Select the mesh
Use the Transform tool to zero out offsets
Apply transforms: Right-click mesh → Bake Transform
Position the Arm for Rigging then combine:
Arrange the arm from a contracted state to a relaxed/extended state
This makes bone placement more intuitive
Keep parts slightly separated for easier weight painting later
Pro Tip: I found it helpful to separate the meshes initially rather than combining them immediately. This separation made weight painting much more manageable, as I could work on each servo and arm segment independently.
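Conceptually, "baking" a transform means pushing the offset into the vertices themselves and resetting the pivot, which is why the mesh stops jumping around afterward. Here is a toy, translation-only sketch of that idea (Unreal's Bake Transform also handles rotation and scale; vertices and offsets below are made up):

```python
# Illustrative sketch of what baking a transform does: apply a
# mesh's pivot offset to its vertices, then zero the offset so
# the pivot sits at the origin. Translation-only toy example.

def bake_translation(vertices, offset):
    """Return vertices with the pivot offset baked in, plus the
    new (zeroed) offset."""
    baked = [(x + offset[0], y + offset[1], z + offset[2])
             for (x, y, z) in vertices]
    return baked, (0.0, 0.0, 0.0)

# A part exported from CAD with an arbitrary pivot 120 units up.
verts = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
offset = (0.0, 0.0, 120.0)

baked_verts, new_offset = bake_translation(verts, offset)
print(baked_verts)   # vertices now carry the offset themselves
print(new_offset)    # pivot is back at the origin
```

After baking, the mesh looks identical in the viewport, but its pivot behaves predictably when you parent it to a bone.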
Step 4: Convert to Skeletal Mesh


Select all your prepared Static Meshes
Right-click → Convert to Skeletal Mesh
Unreal will create a new Skeletal Mesh asset
This creates a basic skeleton structure we'll refine
Phase 3: Skeleton Creation and Weight Painting

Step 5: Manual Bone Placement

This is where precision matters. The SO-ARM 100 has servo motors at each joint—these are our rotation points.
My Strategy:
Lock the viewport to Left/Side orthographic view for precision
Use the "Place Bones Into Mesh" tool from the Skeleton Editor
Place bones only at servo motor locations where actual rotation occurs
Bone Naming Convention:
Bones import as joint1, joint2, joint3, etc.
These maintain the correct parent-child hierarchy automatically
Each bone represents a degree of freedom in the robot
Why this approach works:
Matches the physical robot's actual articulation points
Simplifies the rig (only necessary joints)
Makes animation intuitive (each bone = one motor)
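Because each bone maps to exactly one motor, the whole pose reduces to one angle per joint, and the arm's shape follows from simple forward kinematics down the chain. The 2D sketch below illustrates that idea; the segment lengths are illustrative, not the SO-ARM 100's real dimensions:

```python
import math

def fk_planar(lengths, angles_deg):
    """2D forward kinematics for a serial chain: one angle per
    servo bone (joint1, joint2, ...), each relative to its parent
    segment, just like bone rotations in the skeleton. Returns
    every joint position from the base outward."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for length, angle in zip(lengths, angles_deg):
        heading += math.radians(angle)
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points

# Two-segment arm with illustrative lengths, both servos at 0 deg:
print(fk_planar([12.0, 10.0], [0.0, 0.0]))  # fully extended along X
# Bend the second (elbow) servo 90 deg:
print(fk_planar([12.0, 10.0], [0.0, 90.0]))
```

This is the same logic the rig performs in 3D: rotate one bone and every child joint position updates through the hierarchy.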
Step 6: Weight Painting Strategy
This is where working with separated meshes paid off tremendously.
My Workflow:

Open Skeletal Mesh in Mesh Paint Mode
Work part-by-part systematically:
Select a bone (e.g., joint1)
Paint the arm segment attached to that servo white (value: 1.0)
Keep the servo housing itself black (value: 0.0)
Real-time testing:
Switch between Weight Paint Mode and Animation Mode
Rotate the bone you just painted
Check for vertex deformation issues
Any vertices that "stretch" or "mangle" need correction
Iterative refinement:
Paint → Test → Fix → Repeat
Focus on clean boundaries between rotating parts
Ensure no vertices are influenced by multiple bones unintentionally

Critical Insight: The real-time viewport feedback in Unreal made this process incredibly efficient compared to traditional DCCs. I could immediately see if vertices were caught in the wrong influence zone and fix them on the spot.
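The rule behind the white/black painting above can be checked programmatically: on a rigid mechanical rig, every vertex should follow exactly one bone with weight 1.0, with no soft blending across servo boundaries. This sketch flags offenders; the per-vertex weight dicts are hypothetical, not an actual Unreal data format:

```python
# On a mechanical rig, every vertex should be rigidly bound to a
# single bone. Vertices with blended or partial weights are the
# ones that "stretch" or "mangle" when a joint rotates.

def check_rigid_weights(vertex_weights, tolerance=1e-6):
    """Return indices of vertices whose influences either don't
    sum to 1 or are split across several bones."""
    bad = []
    for i, weights in enumerate(vertex_weights):
        total = sum(weights.values())
        nonzero = [w for w in weights.values() if w > tolerance]
        if abs(total - 1.0) > tolerance or len(nonzero) != 1:
            bad.append(i)
    return bad

weights = [
    {"joint1": 1.0},                  # clean: rigid to joint1
    {"joint1": 0.6, "joint2": 0.4},   # blended: will shear at the servo
    {"joint2": 0.9},                  # doesn't sum to 1: partial influence
]
print(check_rigid_weights(weights))   # → [1, 2]
```

The Paint → Test → Fix loop in the viewport is doing exactly this check visually, one bone rotation at a time.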

Phase 4: Control Rig Implementation
Step 7: Create Modular Control Rig
Finally, the moment we've been working toward! This is where Unreal's Control Rig system truly shines.
The Setup:
Right-click your Skeletal Mesh in the Content Browser
Select Create → Control Rig → Modular Control Rig
This opens the Modular Control Rig Editor
Step 8: Add Physics Dynamics (Chain Dynamics)
Here's where we make the digital robot feel like the physical one.
Implementation:
In the Modular Rig Hierarchy panel:
Locate Chain Dynamics module in the Content Browser
Drag it into your rig
Configure the dynamics:
The SO-ARM 100 has a slight natural wobble after movement; that's how the robot actually behaves in real life!
Adjust Damping to control how quickly the oscillation settles
Tweak Stiffness to match the servos' holding force
Set Mass based on the physical arm segments
Test in real-time:
Enable Live Preview
Animate a joint
Observe the physics simulation
The Result: With just a few clicks, you have a fully rigged robot ready for animation with realistic physics behavior!
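The Damping, Stiffness, and Mass settings behave like the parameters of a classic damped spring, which is why raising Damping makes the post-move wobble die out sooner. This toy integrator is only a sketch of that relationship; the numbers are illustrative, not the Chain Dynamics module's actual defaults or internals:

```python
# A semi-implicit Euler integration of a damped spring, modeling
# one joint's wobble after a move. Higher damping -> faster settle.

def settle_time(stiffness, damping, mass, threshold=0.01,
                dt=0.001, x0=1.0, max_t=60.0):
    """Seconds until a joint displaced by x0 stays within
    |x| < threshold with near-zero velocity (crude check)."""
    x, v, t = x0, 0.0, 0.0
    while t < max_t:
        accel = (-stiffness * x - damping * v) / mass
        v += accel * dt
        x += v * dt
        t += dt
        if abs(x) < threshold and abs(v) < threshold:
            return round(t, 3)
    return max_t

light = settle_time(stiffness=40.0, damping=1.0, mass=0.5)
heavy = settle_time(stiffness=40.0, damping=8.0, mass=0.5)
print(light, heavy)   # more damping, the wobble dies out sooner
```

Tuning the rig is essentially searching this parameter space by eye until the simulated settle matches the physical arm's behavior.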
Phase 5: Final Polish and Creative Freedom
Step 9: Expose Controls for Animators
The beauty of Control Rig is creating an animator-friendly interface:
Create custom controls for:
Individual joint rotation
IK/FK switching for the end effector
Preset poses (home position, extended, contracted)
Add visual gizmos for intuitive manipulation
Set up constraints that match the physical robot's limitations
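Matching the physical robot's limitations mostly comes down to clamping each control's rotation to its servo's range. Here is a minimal sketch of that constraint in plain Python; the limit values are hypothetical placeholders, and the real numbers should come from the SO-ARM 100's specifications:

```python
# Clamp each requested bone rotation to per-servo limits,
# mirroring a rig-space constraint on each control.
# All limit values below are illustrative, not the real specs.

JOINT_LIMITS_DEG = {
    "joint1": (-110.0, 110.0),   # base yaw (hypothetical range)
    "joint2": (-90.0, 90.0),     # shoulder pitch (hypothetical)
    "joint3": (0.0, 135.0),      # elbow (hypothetical)
}

def clamp_pose(pose):
    """Return the pose with every joint forced into its limits."""
    return {j: max(lo, min(hi, pose.get(j, 0.0)))
            for j, (lo, hi) in JOINT_LIMITS_DEG.items()}

requested = {"joint1": 200.0, "joint2": -45.0, "joint3": -10.0}
print(clamp_pose(requested))
# joint1 and joint3 get pulled back inside their ranges
```

Constraining the controls this way means an animator physically cannot pose the digital arm into a state the real hardware can't reach.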
Step 10: Real-time Visualization
With Control Rig, you can:
Give animators a way to plan and visualize robot paths before sending them to hardware; a direct HTTP connection to Unreal for driving the Control Rig position is still to come (TBA)
Create training animations for educational content
Test motion planning in virtual environments
Drive the rig from external data to combine robot telemetry, motion capture, etc.
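Driving the rig from external data ultimately means getting joint angles into Unreal in some serialized form. The HTTP bridge mentioned above is still to come, so this is only a sketch of the data side, packing one telemetry sample into a JSON payload; the field names are assumptions, not an established protocol:

```python
import json
import time

# Pack one pose sample into JSON so an external bridge (HTTP,
# websocket, etc.) could hand it to the rig. Joint names match
# the skeleton's joint1..jointN bones; field names are made up.

def make_pose_message(joint_angles_deg, source="telemetry"):
    """Serialize one pose sample as a JSON string."""
    return json.dumps({
        "source": source,
        "timestamp": time.time(),
        "joints": joint_angles_deg,
    })

msg = make_pose_message({"joint1": 15.0, "joint2": -30.0, "joint3": 45.0})
decoded = json.loads(msg)
print(decoded["joints"]["joint2"])   # → -30.0
```

Whether the source is real robot telemetry or motion capture, the rig side stays the same: receive angles keyed by bone name, apply them to the controls.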
The Learning Process with a Group of Riggers: Iteration as a Way of Fixing Errors
I've discovered that redoing projects in Unreal always works out in the long run. There's a profound rhythm to it:
Observe - Watch professionals demonstrate the workflow
Attempt - Try it yourself, making inevitable mistakes
Execute - Actually complete it, learning from errors
Refine - Redo it properly with deeper understanding
Each iteration revealed new layers. Control Rig isn't just about making things move—it's about building intelligent, procedural systems that respect the underlying mechanics of what you're animating.
The Hardships and Breakthroughs
Challenges I Faced:
Learning curve: Node-based rigging felt overwhelming initially
Transform hell: Fighting with offset pivots from CAD imports
Weight painting precision: Getting clean deformation at servo boundaries
Constraint chains: Debugging complex hierarchies at 2 AM. I had to redo the whole weight painting pass because I did not make a copy first. Make copies and backups often; they can also be a way to debug your own work. Because I missed this crucial step and was under time constraints, I never cleaned up the mesh 100%.
Start learning with low-poly CAD model exports, then move on to more complex, intensive tasks. Unreal can handle a lot of what you throw at it in terms of mesh count and vertices, but keeping things lean is always better for UI responsiveness.
Breakthroughs That Made It Worth It:
Systematic thinking: Control Rig forces you to understand transforms mathematically
Reusable modules: Building rig components that work across projects
Real-time feedback: Seeing results instantly, not after render/bake cycles
Unified pipeline: Rigging, animating, and rendering in one environment
Gratitude: The Team That Made This Possible
None of this would have been achievable without extraordinary support:
The Epic Games Team: They showed up for nearly every question, every roadblock, every "is this even possible?" moment. Their commitment to developer success is genuinely remarkable.
The Instructors: Who taught not just techniques, but ways of thinking about rigging problems. They shared professional workflows and industry wisdom you can only gain from years in production.
The Amazing TA: Patient, knowledgeable, and always willing to dig into the weirdest edge cases. Technical Artists are the unsung heroes of any pipeline.
Fellow Riggers: Learning alongside other professionals pursuing Control Rig mastery created a collaborative environment where we could share discoveries and push each other forward.
Why This Matters Beyond One Robot
The SO-ARM 100 project represents proof that Unreal Engine can serve as a complete content creation environment, not just a rendering engine. For projects demanding tight integration between:
Animation
Physics simulation
Real-time interaction
Educational visualization
...being able to rig entirely within Unreal eliminates entire categories of pipeline friction.
For my work in medical education and AR/VR training, this workflow opens incredible possibilities:
Rigging complex medical devices
Surgical robots for training simulations
Anatomical models with procedural controls
Interactive educational experiences
The Takeaway
CAD models into Control Rig? It geeked me out like playtime, not work.
Unreal's UI workflow? Delivered results that exceeded traditional DCCs.
From bone structure to weight maps? Streamlined and intuitive.
The SO-ARM 100 now moves with precision and purpose—100% rigged in Unreal Engine with Control Rig. And that feels like a victory worth celebrating.
What's your experience with Control Rig? Have you tackled mechanical rigging challenges in Unreal? Let's discuss in the comments—I'd love to hear about your journey!
Resources:
FreeCAD Download: https://www.freecad.org/downloads.php
Unreal Engine Control Rig Documentation: [Epic Docs]
SO-ARM 100 Project Files: [Your repository/link]
#UnrealEngine #ControlRig #Robotics #TechnicalArt #MedicalEducation #ARVR #GameDev #3DAnimation