Simulated environments offer a faster and more flexible alternative to training and testing machine learning models in the real world, but models must also be able to communicate efficiently with the environment. In military-relevant settings, a trained model can play a valuable role in finding cover for an autonomous robot, helping it avoid detection or attack by adversaries. To this end, we present a forest simulation and robot control framework that is ready for integration with machine learning or object recognition algorithms. Our framework provides an environment relevant to military situations and is capable of supplying information about that environment to a machine learning model. The forest environment was designed with wooded areas, open paths, water, and bridges. A Clearpath Husky robot is simulated in the environment using the Army Research Laboratory's (ARL) Unity and ROS simulation framework. The Husky is equipped with a camera and a lidar sensor; data from these sensors can be read through ROS topics and RViz configuration windows, and the robot can be driven using ROS velocity command topics. A machine learning algorithm can employ these communication channels to detect trees and attain maximum cover. Our designed environment improves upon the default ARL framework environments by offering more diverse terrain and more opportunities for cover, making it more relevant to a cover-seeking machine learning model. Code, videos, and the integration process are available at: https://github.com/avispector7/Forest-Simulation
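To illustrate the kind of control loop the framework enables, the following is a minimal sketch of cover-seeking logic: given a lidar scan, pick the beam with the nearest return (assumed here to be a tree trunk) and steer toward it. The function name, gains, and thresholds are illustrative assumptions, not part of the framework; in the actual system the resulting pair would be published as a geometry_msgs/Twist message on a ROS velocity command topic, and the ranges would come from the Husky's lidar topic.

```python
import math

def cover_command(ranges, angle_min, angle_increment, approach_dist=1.0):
    """Hypothetical sketch: return (linear_x, angular_z) steering the robot
    toward the closest lidar return, stopping at approach_dist.

    ranges:          list of lidar range readings (meters)
    angle_min:       bearing of the first beam (radians, robot frame)
    angle_increment: angular spacing between beams (radians)
    """
    # Index of the nearest return; assumed to be the closest tree trunk.
    nearest_i = min(range(len(ranges)), key=lambda i: ranges[i])
    bearing = angle_min + nearest_i * angle_increment

    # Proportional turn toward the tree (gain 0.5 is an arbitrary choice).
    angular_z = 0.5 * bearing

    # Drive forward only when roughly facing the tree and not yet at it.
    facing = abs(bearing) < 0.2
    clear = ranges[nearest_i] > approach_dist
    linear_x = 0.3 if facing and clear else 0.0
    return linear_x, angular_z

# Example scan: 5 beams spanning [-pi/4, pi/4]; nearest return is dead ahead.
ranges = [4.0, 3.5, 1.8, 3.6, 4.2]
lin, ang = cover_command(ranges, -math.pi / 4, math.pi / 8)
# With the nearest tree straight ahead, the sketch drives forward without turning.
```

In the full framework, a learned tree detector would replace the naive nearest-return heuristic, but the ROS topic interface for reading sensor data and publishing velocity commands would remain the same.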