Activity recognition (AR) systems for older adults are common in residential health-care settings such as hospitals and nursing homes, and numerous solutions and studies have been presented to improve their performance. Yet delivering sufficiently robust AR systems from recorded sensor data remains a challenging task. AR in a smart environment relies on large amounts of sensor data, from which effective features must be derived to track activities of daily living. This paper improves the performance of an AR system by using a convolutional neural network (CNN). It analyzes signals from network sensors, such as W2ISP and RFID sensors, distributed at different locations in two clinical rooms at the Elizabeth hospital. The proposed approach recognizes the daily activities that are considered key indicators of fall risk for older adults in a hospital or nursing home. A deep activity CNNets model is trained on effective features extracted from daily-activity sensor data and is then used to recognize the highest fall-risk activities in the test data. The approach uses an existing dataset of fourteen healthy older volunteers (ten females and four males) and is compared with other approaches evaluated on the same dataset. The experimental results show that the proposed approach outperforms the others, achieving 96.37±3.63% in the first clinic room and 98.37±1.63% in the second clinic room. These results indicate that a deep-learning methodology can effectively assess fall risk based on wearable sensor data.
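Since the abstract only names the model family (a deep CNN applied to wearable-sensor signals), the following is a minimal, hypothetical sketch of how such a 1D CNN classifier over windows of sensor readings could be set up in PyTorch; the channel count, window length, number of activity classes, and layer sizes are assumptions for illustration, not the paper's actual architecture.

```python
# Illustrative sketch only: layer sizes, window length, and class count
# are assumed values, not the architecture reported in the paper.
import torch
import torch.nn as nn

NUM_CHANNELS = 4      # assumed: e.g. 3 accelerometer axes + RSSI from the W2ISP/RFID stream
WINDOW_LEN   = 128    # assumed: samples per sliding window of sensor readings
NUM_CLASSES  = 4      # assumed: e.g. lying, sitting on bed, sitting on chair, ambulating

class ActivityCNN(nn.Module):
    """1D CNN that maps a window of multichannel sensor readings to an activity class."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (WINDOW_LEN // 4), 128),
            nn.ReLU(),
            nn.Linear(128, NUM_CLASSES),
        )

    def forward(self, x):  # x: (batch, channels, window_len)
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = ActivityCNN()
    dummy = torch.randn(8, NUM_CHANNELS, WINDOW_LEN)  # batch of synthetic sensor windows
    logits = model(dummy)
    print(logits.shape)  # (8, NUM_CLASSES): one score per activity class
```

In a setup like this, the recognized per-window activity labels would then feed whatever fall-risk scoring the study applies downstream.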