In this paper, we present a control architecture for an intelligent outdoor mobile robot. It enables the robot to navigate in a complex, natural outdoor environment, relying only on a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure-from-motion algorithm calculates a depth map of the environment, and a visual simultaneous localization and mapping (SLAM) algorithm builds a map of the surroundings using image features.
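As a rough illustration of this twofold analysis, the sketch below processes a pair of consecutive monocular frames with OpenCV: dense optical flow stands in for the structure-from-motion depth cue, and ORB keypoints stand in for the SLAM feature front-end. The specific functions and parameters are assumptions chosen for illustration, not the algorithms used in this work.

```python
# Hedged sketch of the twofold analysis of the monocular image stream.
# Assumptions (not from the paper): Farneback optical flow as a stand-in for
# the dense structure-from-motion step, ORB keypoints as a stand-in for the
# visual SLAM feature front-end.
import cv2
import numpy as np

def analyze_frame_pair(prev_gray: np.ndarray, curr_gray: np.ndarray):
    # Dense branch: per-pixel optical flow between consecutive frames.
    # With known camera motion, large flow corresponds to nearby structure,
    # so the inverse flow magnitude serves as a crude depth-map proxy.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    depth_proxy = 1.0 / (np.linalg.norm(flow, axis=2) + 1e-6)

    # Sparse branch: image features that a visual SLAM back-end would use
    # to build and localize against a map of the surroundings.
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, descriptors = orb.detectAndCompute(curr_gray, None)

    return depth_proxy, keypoints, descriptors
```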
These depth and feature maps enable a behavior-based motion and path planner to navigate the robot through the environment. In this paper, we describe the theoretical aspects of setting up this architecture.
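To make the behavior-based planning step concrete, the following sketch blends velocity proposals from two illustrative behaviors: obstacle avoidance driven by the depth map and goal seeking driven by the feature-based map. The behavior names, the weighted-sum arbitration, and all parameters are assumptions, not the planner described in this paper.

```python
# Hedged sketch of behavior-based arbitration. Every name, weight, and the
# weighted-sum arbiter are illustrative assumptions; the paper's planner may
# combine behaviors differently (e.g. by priority/subsumption).
import math
from dataclasses import dataclass

import numpy as np

@dataclass
class MotionCommand:
    speed: float     # forward velocity (m/s)
    steering: float  # turn rate (rad/s)

def avoid_obstacles(depth_map: np.ndarray) -> tuple[MotionCommand, float]:
    # Steer toward the more open half of the depth map; the behavior's weight
    # grows as the nearest obstacle gets closer.
    h, w = depth_map.shape
    left, right = depth_map[:, : w // 2], depth_map[:, w // 2 :]
    steering = 0.5 if left.mean() > right.mean() else -0.5
    weight = 1.0 / max(float(depth_map.min()), 0.1)
    return MotionCommand(speed=0.2, steering=steering), weight

def seek_goal(pose: tuple[float, float, float],
              goal: tuple[float, float]) -> tuple[MotionCommand, float]:
    # Turn toward a goal expressed in the map built from image features.
    x, y, theta = pose
    bearing = math.atan2(goal[1] - y, goal[0] - x) - theta
    return MotionCommand(speed=0.5, steering=bearing), 1.0

def arbitrate(proposals: list[tuple[MotionCommand, float]]) -> MotionCommand:
    # Weighted blend of all active behaviors' proposals.
    total = sum(w for _, w in proposals) or 1.0
    return MotionCommand(
        speed=sum(c.speed * w for c, w in proposals) / total,
        steering=sum(c.steering * w for c, w in proposals) / total,
    )
```

In each control cycle, such a planner would call `arbitrate([avoid_obstacles(depth), seek_goal(pose, goal)])` and send the resulting command to the robot base.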