The real option management of commodity conversion assets gives rise to intractable Markov decision processes (MDPs). This intractability stems primarily from the high dimensionality of the commodity forward curve, which is part of the MDP state when the evolution of this curve is modeled with high-dimensional price models, as is common in practice. Focusing on commodity storage, we develop a novel approximate dynamic programming methodology that hinges on relaxations of approximate linear programs (ALPs) formulated with value function approximations that reduce the number of futures prices included in the MDP state. We derive equivalent approximate dynamic programs (ADPs) for a class of these ALPs, a class that also subsumes a known ADP. We obtain two new ADPs whose value functions induce feasible policies for the original MDP, as well as lower and upper bounds on the value of an optimal policy of this MDP, estimated via Monte Carlo simulation. We investigate the performance of our ADPs on existing natural gas instances and new crude oil instances. Our approach has potential relevance for the approximate solution of MDPs that arise in the real option management of other commodity conversion assets, as well as the valuation and management of real and financial options that depend on forward curve dynamics.
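To make the bounding idea concrete, the sketch below illustrates, under simplifying assumptions, how a feasible storage policy induced by a value function approximation defined on a reduced state (inventory and spot price only, rather than the full forward curve) yields a Monte Carlo lower bound estimate on the optimal policy value. This is not the paper's ALP-relaxation method: the one-factor lognormal spot model, the functions approx_value and greedy_action, and all numerical parameters are hypothetical choices made purely for illustration.

```python
import numpy as np

# Illustrative sketch (hypothetical model and parameters): a greedy policy
# induced by an assumed value function approximation over a reduced state
# (inventory, spot price) is simulated; its average discounted cash flow
# estimates a lower bound on the value of an optimal storage policy.

rng = np.random.default_rng(0)

T = 12                            # monthly decision stages
capacity = 1.0                    # storage capacity (normalized)
inj_rate, wdr_rate = 0.25, 0.25   # per-stage injection/withdrawal limits
s0, sigma, r = 4.0, 0.35, 0.05    # initial spot, volatility, discount rate
dt = 1.0 / 12
n_paths = 10_000

def approx_value(t, inventory, spot):
    """Hypothetical value function approximation: values remaining
    inventory at the current spot, damped by the time remaining."""
    return inventory * spot * (T - t) / T

def greedy_action(t, inventory, spot):
    """Choose the feasible action (withdraw, hold, inject) maximizing
    immediate cash flow plus the approximate continuation value."""
    best_a, best_val = 0.0, -np.inf
    for a in (-min(wdr_rate, inventory),            # withdraw (sell)
              0.0,                                   # hold
              min(inj_rate, capacity - inventory)):  # inject (buy)
        cash = -a * spot                             # buy if a > 0, sell if a < 0
        cont = approx_value(t + 1, inventory + a, spot)
        if cash + cont > best_val:
            best_a, best_val = a, cash + cont
    return best_a

# Monte Carlo simulation of the induced policy: because the policy is
# feasible for the storage MDP, its estimated value is a lower bound
# estimate on the value of an optimal policy.
total = 0.0
for _ in range(n_paths):
    spot, inventory, value = s0, 0.0, 0.0
    for t in range(T):
        a = greedy_action(t, inventory, spot)
        value += np.exp(-r * t * dt) * (-a * spot)
        inventory += a
        # one-factor lognormal spot evolution (assumed price model)
        spot *= np.exp((r - 0.5 * sigma**2) * dt
                       + sigma * np.sqrt(dt) * rng.standard_normal())
    total += value

print(f"Estimated lower bound on optimal policy value: {total / n_paths:.3f}")
```

In the paper's setting the approximate value functions come from the ADPs derived from the ALP relaxations and depend on a reduced set of futures prices; the sketch merely shows why any such approximation, once it induces a feasible policy, produces a simulation-based lower bound.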