In everyday life, we frequently have to decide which hand to use for a given action. It has been suggested that the brain makes this decision by computing expected costs based on action values, such as expected biomechanical cost, expected success rate, handedness, and skillfulness. Although these conclusions were based on experiments in stationary subjects, we often act while the body is in motion. We investigated how hand choice is affected by passive body motion, which directly modulates the biomechanical costs of an arm movement through the arm's inertia. Using a linear motion platform, 12 right-handed subjects were sinusoidally translated (0.625 and 0.5 Hz). At 8 possible motion phases, they had to reach, with either the left or the right hand, to a target presented at 1 of 11 possible locations. We predicted hand choice by calculating the expected biomechanical costs under different assumptions about the acceleration entering these computations: the forthcoming acceleration during the reach, the instantaneous acceleration at target onset, or zero acceleration, as if the body were stationary. Although hand choice was generally biased toward the dominant hand, it also modulated sinusoidally with the motion, with the amplitude of the bias depending on the motion's peak acceleration. The phase of the hand choice modulation was consistent with the cost model based on the instantaneous acceleration at target onset. This suggests that, when deciding on hand choice during passive whole body motion, the brain relies on bottom-up acceleration signals rather than on predictions of future accelerations.

Decisions of hand choice are a fundamental aspect of human behavior. Whereas these decisions are typically studied in stationary subjects, this study examines hand choice while subjects are in motion.
We show that accelerations of the body, which differentially modulate the biomechanical costs of left and right hand movements, are also taken into account when deciding which hand to use for a reach, possibly based on bottom-up processing of the otolith signal.
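The cost comparison described above can be illustrated with a toy sketch: for sinusoidal translation, the instantaneous acceleration at target onset follows from the motion phase, and each hand's expected cost can be penalized when the reach must work against the resulting inertial force. All rest positions, cost weights, and the dominance discount below are illustrative assumptions, not the model or parameters used in the study.

```python
import math

def instantaneous_acceleration(amplitude_m, freq_hz, t_s):
    """Lateral acceleration (m/s^2) of a sinusoidal translation
    x(t) = A*sin(2*pi*f*t); its second derivative is
    a(t) = -A*(2*pi*f)**2 * sin(2*pi*f*t)."""
    omega = 2.0 * math.pi * freq_hz
    return -amplitude_m * omega ** 2 * math.sin(omega * t_s)

def expected_cost(target_x, rest_x, accel, k=0.2):
    """Hypothetical biomechanical cost of reaching from rest_x to target_x.

    The inertial force on the arm acts opposite to the body's acceleration,
    so reaching along the acceleration (against the inertial force) is
    penalized, while reaching against it is assisted."""
    displacement = target_x - rest_x
    return abs(displacement) + k * accel * displacement

def choose_hand(target_x, accel, dominance_discount=0.1, k=0.2):
    """Pick the hand with the lower expected cost; a fixed discount on the
    right-hand cost mimics the dominant-hand bias seen in the data."""
    left_rest, right_rest = -0.2, 0.2  # hypothetical hand rest positions (m)
    cost_left = expected_cost(target_x, left_rest, accel, k)
    cost_right = expected_cost(target_x, right_rest, accel, k) - dominance_discount
    return "left" if cost_left < cost_right else "right"

# With zero acceleration (stationary body), a central target goes to the
# dominant right hand; a strong leftward acceleration at target onset
# shifts the choice to the left hand.
print(choose_hand(0.0, accel=0.0))   # right
print(choose_hand(0.0, accel=-5.0))  # left
```

The three model variants contrasted in the study correspond to which acceleration value is fed into such a cost comparison: the acceleration during the upcoming reach, the instantaneous value at target onset (as returned by `instantaneous_acceleration` for the onset time), or simply zero.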