This thesis enhances the autonomy of the M4 (Multi-Modal Mobility Morphobot), a robot designed for Mars exploration and rescue missions. The research enables the robot to autonomously select its locomotion mode and path in complex terrains. Focusing on the walking and flying modes, a Gazebo simulation and custom perception and navigation pipelines are developed. Leveraging deep learning, the robot determines optimal mode transitions based on a 2.5D map. Additionally, an energy-efficient path planner based on 2.5D mapping is implemented and validated in simulation. The contributions are designed to scale to future mode integrations. The M4 robot showcases intelligent mode switching, efficient navigation, and reduced energy consumption, bringing fully autonomous multi-modal robots for exploration and rescue missions a step closer. This work paves the way for future advancements in autonomous robotics, with the ultimate vision of deploying the M4 robot for exploration and rescue tasks as an intelligent and versatile robotic system.