In this paper we further investigate the relationship, reported by Oates et al. (2012), between the optical/UV afterglow luminosity (measured at rest-frame 200 s) and the average afterglow decay rate (measured from rest-frame 200 s onwards) of long-duration gamma-ray bursts (GRBs). We extend the analysis to the X-ray light curves and find a consistent correlation. We then explore how the parameters of these correlations relate to the prompt emission phase and, using a Monte Carlo simulation, test whether the correlations are consistent with the predictions of the standard afterglow model. We find significant correlations between log L_O,200s and log L_X,200s, and between α_O,>200s and α_X,>200s, consistent with the simulations. The model also predicts relationships between log E_iso and log L_200s; however, while we find such relationships in the observed sample, the slope of the linear regression is shallower than that simulated and inconsistent at the 3σ level. The simulations also do not reproduce the observed correlations between log L_200s and α_>200s, or between log E_iso and α_>200s. Overall, these observed correlations are consistent with a common underlying physical mechanism producing GRBs and their afterglows regardless of their detailed temporal behaviour. However, a basic afterglow model has difficulty explaining all of the observed correlations, which leads us to briefly discuss alternative, more complex models.
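The kind of analysis summarised above — measuring the strength of a luminosity–decay correlation and comparing an observed regression slope against a simulated one — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: the sample size, scatter, and built-in slope of −0.3 are hypothetical values chosen only to mimic an anti-correlation between log L_200s and α_>200s.

```python
import numpy as np
from scipy.stats import spearmanr, linregress

rng = np.random.default_rng(42)

# Hypothetical synthetic sample: log L_200s (luminosity at rest-frame 200 s)
# and alpha_>200s (average decay index), generated with a built-in
# anti-correlation of slope -0.3 plus Gaussian scatter (all values illustrative).
n = 100
log_L = rng.normal(31.0, 1.0, n)                      # log10 luminosity, arbitrary scale
alpha = -0.5 - 0.3 * (log_L - 31.0) + rng.normal(0.0, 0.2, n)

# Rank correlation: strength and significance of the observed relation.
rho, p_value = spearmanr(log_L, alpha)

# Linear regression: observed slope with its standard error.
fit = linregress(log_L, alpha)

# Consistency check against a (hypothetical) simulated model slope:
# flag inconsistency if the slopes differ by more than 3 sigma.
slope_sim = -0.3
n_sigma = abs(fit.slope - slope_sim) / fit.stderr
consistent_3sigma = n_sigma < 3.0

print(f"Spearman rho = {rho:.2f}, p = {p_value:.2e}")
print(f"slope = {fit.slope:.2f} +/- {fit.stderr:.2f}, "
      f"|Delta|/sigma = {n_sigma:.1f}, consistent: {consistent_3sigma}")
```

Because the synthetic data are generated from the same slope used in the consistency check, this toy example recovers a significant anti-correlation and a slope consistent with the "model"; with real data the comparison would use slopes drawn from the Monte Carlo afterglow simulations.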