Abstract—This paper proposes a novel brain-machine interfacing (BMI) paradigm for the control of a multijoint redundant robot system. The user determines the direction of end-point movement of a 3-degrees-of-freedom (DOF) robot arm using motor imagery electroencephalography signals with a coadaptive decoder (adaptivity between the user and the decoder), while a synergetic motor-learning algorithm manages the peripheral redundancy of the multi-DOF joints toward energy optimality through tacit learning. As in human motor control, a torque control paradigm is employed so that the robot adapts to the given physical environment. The dynamic condition of the robot arm is taken into account by the learning algorithm; thus, the user needs to think only about the end-point movement of the robot arm, which allows simultaneous multijoint control by BMI. The support vector machine (SVM)-based decoder designed in this paper adapts to the changing mental state of the user. Online experiments reveal that users successfully reach their targets with an average decoder accuracy of over 75% under different end-point load conditions.

Index Terms—Brain-machine interfacing (BMI), coadaptive decoder, joint redundancy, multijoint robot, synergetic learning control, tacit learning.

I. INTRODUCTION

AS OF today, brain-machine interfacing (BMI) [or brain-computer interfacing (BCI)] is one of the fastest growing areas of research, providing a unique channel of communication between a human and a machine (or device) without any neuromuscular intervention [1]. BMI was initially conceived to provide rehabilitative and assistive solutions [2], [3] for patients suffering from neuromuscular degenerative diseases and injuries, such as amyotrophic lateral sclerosis, cervical spinal cord injury, paralysis, or amputation [4].
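As an illustrative aside, the kind of two-class motor-imagery decoding described above can be sketched with a minimal linear classifier. Everything here is an assumption for illustration, not the authors' implementation: a perceptron stands in for the paper's SVM to keep the sketch dependency-free, and the two features (mu-band power over electrodes C3 and C4) are drawn from synthetic Gaussians that mimic contralateral mu suppression during hand motor imagery.

```python
# Hypothetical sketch: linear decoding of left- vs right-hand motor imagery
# from two synthetic band-power features. NOT the paper's decoder.
import random

random.seed(0)

def synthetic_trial(label):
    """Assumed feature model: right-hand imagery (+1) suppresses mu power
    over contralateral C3; left-hand imagery (-1) suppresses it over C4."""
    if label == +1:
        c3, c4 = random.gauss(0.4, 0.15), random.gauss(1.0, 0.15)
    else:
        c3, c4 = random.gauss(1.0, 0.15), random.gauss(0.4, 0.15)
    return (c3, c4), label

train = [synthetic_trial(random.choice((-1, +1))) for _ in range(200)]
test = [synthetic_trial(random.choice((-1, +1))) for _ in range(100)]

# Train a perceptron (stand-in for the SVM): nudge the weights on
# every misclassified trial until the classes separate.
w, b = [0.0, 0.0], 0.0
for _ in range(20):
    for (c3, c4), y in train:
        if y * (w[0] * c3 + w[1] * c4 + b) <= 0:
            w[0] += y * c3
            w[1] += y * c4
            b += y

def decode(c3, c4):
    """Map one feature vector to a movement-direction class."""
    return +1 if w[0] * c3 + w[1] * c4 + b > 0 else -1

accuracy = sum(decode(*x) == y for x, y in test) / len(test)
print(f"decoder accuracy on held-out trials: {accuracy:.2f}")
```

On these well-separated synthetic features the linear decoder performs near-perfectly; real electroencephalography features are far noisier and nonstationary, which is precisely why the paper argues for a coadaptive decoder.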