Splat Raycasting for Robot Locomotion
XR developer and artist RubenFro built a walking robot that navigates directly through World Labs–generated splat environments — without meshes, navmeshes, or baked colliders.
The Experiment
Instead of relying on traditional geometry, the robot reads raw splat data to understand where it can move.
- Probes terrain directly from splats
- Adapts movement in real time
- Climbs and traverses uneven surfaces
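The core idea of probing terrain from splats can be sketched as a ray march through the splat cloud: accumulate Gaussian opacity along the ray and report a surface hit once it crosses a threshold. This is a minimal illustrative sketch, not RubenFro's actual implementation — the `Splat` fields, isotropic scale, and all thresholds here are simplifying assumptions (real splats are anisotropic with rotation and per-axis scale).

```python
import math
from dataclasses import dataclass

@dataclass
class Splat:
    center: tuple   # (x, y, z) world position
    scale: float    # isotropic radius; a simplification of anisotropic splats
    opacity: float  # 0..1

def gaussian_opacity(splat, point):
    """Opacity contribution of one splat at a sample point."""
    d2 = sum((p - c) ** 2 for p, c in zip(point, splat.center))
    return splat.opacity * math.exp(-d2 / (2 * splat.scale ** 2))

def splat_raycast(splats, origin, direction, max_dist=10.0, step=0.05, threshold=0.5):
    """March along the ray (direction assumed normalized), accumulating
    opacity; return the first point where the cloud reads as 'solid'."""
    accum = 0.0
    t = 0.0
    while t < max_dist:
        point = tuple(o + d * t for o, d in zip(origin, direction))
        for s in splats:
            accum += gaussian_opacity(s, point) * step
        if accum >= threshold:
            return point  # treat this as a surface the robot can stand on
        t += step
    return None  # no ground found along this ray
```

A downward ray cast from above a splat returns a foothold point near the splat's surface; casting over empty space returns `None`, which the locomotion controller can treat as "no ground here."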
The result feels less like a game simulation and more like a machine sensing a real place.
How It Works
Inside Unity, custom raycasting systems decode splat position, scale, and opacity to determine where the robot can step.
Nearby rays estimate surface angle and stability, driving locomotion, balance, climbing, and wall walking directly from splat data.
Stack
- Marble / World Labs for splat generation
- Unity for simulation and rendering
- Custom splat raycasting + procedural locomotion
- Analytic IK for adaptive leg movement
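Analytic IK for a two-segment leg has a closed-form solution via the law of cosines. A hedged 2D sketch under assumed conventions (segment lengths, angle signs, and the planar simplification are all illustrative — the project's actual solver is not published):

```python
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Closed-form 2D two-bone IK: hip and knee angles placing the foot
    at the target. Returns (hip, knee) in radians, or None if unreachable."""
    d = math.hypot(target_x, target_y)
    if d > l1 + l2 or d < abs(l1 - l2):
        return None  # target outside the leg's reachable annulus
    # Law of cosines for the interior angle at the knee
    cos_knee = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    knee = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle: direction to target minus the offset from the knee bend
    cos_alpha = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    alpha = math.acos(max(-1.0, min(1.0, cos_alpha)))
    hip = math.atan2(target_y, target_x) - alpha
    return hip, knee
```

Because the solution is closed-form rather than iterative, each leg can be re-solved every frame against the latest splat-probed foothold with negligible cost.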
What’s Next
- Autonomous rover-style navigation
- Simulated LiDAR and depth sensing
- Exploration systems that interpret splat worlds like real terrain
Additional videos and development posts are available on X and 80 Level.
More showcases
December 2025
Splat Collider Builder Tool
Lightweight collision tooling for interactive Gaussian splat scenes.

January 2026
Concept to Splat in Unreal Engine
Bringing AI-generated worlds into real-time production workflows.

February 2026
Dormant Memories
Moving between real spaces and generated alternate realities.
