A math word problem (MWP) is a coherent narrative that reflects the underlying logic of a set of math equations. Successful MWP generation can automate the writing of mathematics questions. Previous methods mainly generate MWP text from inflexible pre-defined templates. In this paper, we propose a neural model for generating MWP text from math equations. First, we incorporate a matching model conditioned on domain knowledge to retrieve the MWP instance most consistent with the ground truth, where the domain is a latent variable extracted by a domain summarizer. Second, by constructing a Quantity Cell Graph (QCG) from the retrieved MWP instance and reasoning over it, we improve the model's comprehension of real-world scenarios and derive a domain-constrained instance sketch to guide generation. In addition, the QCG interacts with the equation encoder to strengthen the alignment between math tokens (e.g., quantities and variables) and MWP text. Experiments and empirical analysis on an educational MWP dataset show that our model achieves strong performance on both automatic and human evaluation metrics.

Example equation–problem pairs:

Equations: x = 6 * y ; (x + y) * 3 = 147
Problem: Jane travels 6 times faster than Mike. Traveling in opposite directions, they are 147 miles apart after 3 hours. Find their rates of travel.

Equations: (1 - 1/3 - 9/20) * x = 245
Problem: At a local high school, 3/8 of the students are freshmen, 1/4 are juniors, and 245 are seniors. Find the total number of students.
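To make the Quantity Cell Graph idea concrete, the following is a minimal, illustrative sketch (not the paper's actual construction): it treats the quantities and variables appearing in an equation as graph nodes and links tokens that co-occur in the same equation. The paper's QCG additionally attaches contextual words from the retrieved MWP instance to each quantity cell, which this toy version omits.

```python
import re
from collections import defaultdict

def build_quantity_cell_graph(equation):
    """Toy quantity-cell graph: nodes are quantities and variables
    extracted from an equation; edges connect tokens that co-occur
    in that equation. Hypothetical helper, for illustration only."""
    # Match fractions (e.g., 1/3), plain numbers, and variable names.
    tokens = re.findall(r"\d+/\d+|\d+(?:\.\d+)?|[a-zA-Z]+", equation)
    graph = defaultdict(set)
    # Fully connect the distinct tokens of the equation.
    for i, u in enumerate(tokens):
        for v in tokens[i + 1:]:
            if u != v:
                graph[u].add(v)
                graph[v].add(u)
    return dict(graph)

# Example from the first equation-problem pair above.
qcg = build_quantity_cell_graph("x = 6 * y")
# nodes: x, 6, y; each connected to the other two
```

A real implementation would then run message passing over such a graph to produce the domain-constrained instance sketch described above; this snippet only shows how equation tokens can be organized into a graph structure.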