A human-activity wearable obstacle-detection system for the visually impaired (VI) is developed for routine monitoring and observation of surrounding events. Environmental observation, home surveillance, and assistive supports are now built on wearable devices that use inertial sensors such as accelerometers, linear-acceleration sensors, and gyroscopes. Previous assisted living systems (ALS) still face challenges in energy management and resource allocation during daily activities, particularly ambulation. Because legacy systems cannot fully improve user self-esteem, WearROBOT is proposed. It detects rearview obstacles and incorporates an audio feedback system that announces an obstacle once it is detected. A linear programming (LP) multi-commodity graph (LMCG) learning model is proposed, coupled with shortest-path resource allocation for space-diversity linearization. An infrared sensor objective function that minimizes link utilization is derived. Angle-intensity analysis (AIA) is carried out on various use-case scenarios so that the user can select the best angle for a given usage and for battery conservation. The work shows how intensity differs at angles of 5°, 15°, 20°, 35°, and 45°. The reflectivity of different materials and its effect on battery life are also studied. As the wearable robot moves away from the node obstacle, the LMCG narrow-band sensor node (LMCG-NB-IoT) drops energy significantly. Low-Power WAN (LP-WAN), Bluetooth Low Energy (BLE), and the proposed LMCG-NB-IoT offered 51.28%, 33.33%, and 15.39%, respectively. In terms of energy latency, the schemes gave 65.63%, 31.25%, and 3.12%, respectively. Similarly, the proposed LMCG-NB-IoT had a 50% battery life profile. Finally, the WearROBOT mobility aid minimizes injuries experienced by the visually impaired.
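
To illustrate the kind of shortest-path resource allocation the LMCG model couples to, the sketch below poses a single-commodity routing problem as a linear program. It is a minimal illustration only: the graph, node names, and link costs are assumptions for demonstration, not the paper's actual WearROBOT network model or its multi-commodity formulation.

```python
# Minimal sketch: shortest-path resource allocation posed as a linear program.
# The graph, link costs, and node names are illustrative assumptions,
# not the WearROBOT sensor network described in the abstract.
import numpy as np
from scipy.optimize import linprog

# Directed links (u, v) with an energy/utilization cost per unit of flow.
links = [("S", "A", 2.0), ("S", "B", 4.0), ("A", "B", 1.0),
         ("A", "T", 7.0), ("B", "T", 3.0)]
nodes = ["S", "A", "B", "T"]

# One commodity: route a unit of sensed data from source S to sink T.
# Flow conservation: +1 leaving S, -1 entering T, 0 at the relays.
supply = {"S": 1.0, "A": 0.0, "B": 0.0, "T": -1.0}

cost = np.array([c for _, _, c in links])      # objective coefficients
A_eq = np.zeros((len(nodes), len(links)))      # node-link incidence matrix
for j, (u, v, _) in enumerate(links):
    A_eq[nodes.index(u), j] = 1.0              # flow leaving u
    A_eq[nodes.index(v), j] = -1.0             # flow entering v
b_eq = np.array([supply[n] for n in nodes])

# Each link carries between 0 and 1 unit of the commodity.
res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=[(0.0, 1.0)] * len(links))

print("minimum total link cost:", res.fun)
for (u, v, _), x in zip(links, res.x):
    if x > 1e-6:
        print(f"use link {u}->{v} with flow {x:.2f}")
```

Because the node-link incidence matrix is totally unimodular, the LP relaxation returns an integral path (here S→A→B→T at cost 6), which is the sense in which a shortest-path allocation can be solved with standard linear programming.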