In recent years, researchers have focused on analyzing humans' activities of daily living to study the performance metrics that humans subconsciously optimize while performing a particular task. To recreate these motions in robotic structures based on the human model, researchers have developed frameworks for robot motion planning that use various optimization methods to replicate motions demonstrated by humans. As part of this process, the motion data of the human body and of the objects involved must be recorded to provide all the information essential for motion planning. This paper provides a dataset of human motion during activities of daily living, consisting of detailed and accurate whole-body motion data collected with a Vicon motion capture system. The data have been used to generate a subject-specific full-body model in OpenSim and to compute joint angles within the OpenSim framework, which can subsequently be applied to the subject-specific robotic model developed in a MATLAB framework. The dataset comprises nine daily living activities and eight Range of Motion activities performed by ten healthy participants, with two repetitions of each activity, resulting in 340 demonstrations in total. The whole-body human motion database is publicly available as the Center for Assistive, Rehabilitation, and Robotics Technologies (CARRT) Motion Capture Data for Robotic Human Upper Body Model, and contains raw motion data in .c3d format, motion data in .trc format for the OpenSim model, and post-processed motion data for the MATLAB-based model.
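For reference, the joint angles distributed with the dataset can be reproduced by running OpenSim inverse kinematics on one of the .trc trials. The sketch below assumes the OpenSim 4.x Python bindings are installed; the file names (subject01_scaled.osim, trial01.trc) and the time range are placeholders, not files or values taken from the database.

import opensim as osim

# Placeholder file names: substitute the subject-specific scaled model and a
# .trc marker-trajectory file downloaded from the CARRT motion capture database.
model = osim.Model("subject01_scaled.osim")

ik_tool = osim.InverseKinematicsTool()
ik_tool.setModel(model)                              # subject-specific full-body model
ik_tool.setMarkerDataFileName("trial01.trc")         # marker trajectories for one trial
ik_tool.setStartTime(0.0)                            # analysis window (adjust to the trial length)
ik_tool.setEndTime(5.0)
ik_tool.setOutputMotionFileName("trial01_ik.mot")    # joint angles written to a .mot file
ik_tool.run()

The resulting .mot file contains the joint-angle trajectories that can then be mapped onto the MATLAB-based robotic upper-body model.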