We investigate a novel characteristic of the conjugate function associated with a generic convex optimization problem, which can subsequently be leveraged for efficient dual decomposition methods. In particular, under mild assumptions, we identify a region of the domain of the conjugate function in which, from any point, there is a ray along which the gradient of the conjugate remains constant. We refer to this characteristic as a fixed gradient over rays (FGOR). We further show that the dual function inherits the FGOR characteristic from the conjugate. We then provide a thorough exposition of how FGOR applies to dual subgradient methods. More importantly, we leverage FGOR to devise a simple stepsize rule that can be prepended to state-of-the-art stepsize methods, making them more efficient. Furthermore, we investigate how FGOR can be exploited when solving the global consensus problem. We show that it can be used not only to expedite the convergence of dual decomposition methods but also to reduce the communication overhead when distributed implementations are sought. Numerical experiments using quadratic objectives and a regularized linear regression on a real-world dataset compare the practical performance of FGOR with state-of-the-art stepsize methods. Results show that the proposed approach can significantly improve the convergence of existing methods while saving a considerable amount of communication overhead.
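As a minimal sketch of the FGOR characteristic stated above, assuming standard conjugate notation (the symbols $f^*$, $\mathcal{D}$, and $d(\lambda)$ are illustrative, not taken from the paper; the precise region and directions are developed in the body of the work):

\[
f^*(\lambda) = \sup_{x} \big\{ \lambda^\top x - f(x) \big\},
\qquad
\exists\, \mathcal{D} \subseteq \operatorname{dom} f^* \ \text{such that} \
\forall \lambda \in \mathcal{D},\ \exists\, d(\lambda):
\]
\[
\nabla f^*\big(\lambda + t\, d(\lambda)\big) = \nabla f^*(\lambda)
\quad \text{for all } t \ge 0.
\]

Under this reading, the dual function of the underlying convex problem, being itself a conjugate-type object, inherits the same constant-gradient-along-a-ray property.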