Abstract-We present TRUSS, or Tracking Risk with Ubiquitous Smart Sensing, a novel system that infers and renders safety context on construction sites by fusing data from wearable devices, distributed sensing infrastructure, and video. Wearables stream real-time measurements of dangerous gases, dust, noise, light quality, altitude, and motion to base stations that synchronize the mobile devices, monitor the environment, and capture video. At the same time, low-power video collection and processing nodes track workers as they move through the cameras' fields of view, identifying their tracks using information from the wearable sensors. Together, these processes link the context-mining wearable sensors to the video: information derived from the sensor data is used to highlight salient elements in the video stream. The augmented stream in turn gives users a better understanding of real-time risks and supports informed decision-making. We tested our system in an initial deployment on an active construction site.