An urban rainstorm can evolve into a serious emergency, generally characterized by high complexity, uncertainty, and time pressure. Due to limited information and time constraints, individuals often struggle to find the optimal response strategy. Consequently, classical decision-making methods based on the "infinite rationality" assumption sometimes fail to reflect reality. Building on the recognition-primed decision (RPD) model, this paper proposes a dynamic RPD (D-RPD) model. The D-RPD model assumes that decision-makers gain experience during the escape process and that their risk perception of the rainstorm disaster can be regarded as a Markov process, with experience from more recent attempts contributing more heavily to decision-making. We design agents according to the D-RPD model and employ a multi-agent system (MAS) to simulate individuals' decisions during a rainstorm. Our results show that experience helps individuals perform better when escaping a rainstorm, and that recency is one of the key elements in escape decision-making. We also find that closing the information gap between individuals and the real-time disaster situation helps individuals perform well, especially when they tend to avoid extreme decisions.
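
To make the mechanism concrete, the sketch below illustrates one possible reading of a D-RPD agent: perceived risk evolves as a small Markov chain, and past escape attempts are weighted by recency when the agent recognizes a satisficing option. The state discretization, transition matrix, decay factor, threshold rule, and all names are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

# Illustrative sketch of a D-RPD-style agent (assumed structure, not the authors' code).
RISK_STATES = ["low", "medium", "high"]          # hypothetical discretization of risk perception

# Hypothetical Markov transition matrix: row = current perceived risk, column = next.
TRANSITIONS = np.array([
    [0.70, 0.25, 0.05],
    [0.20, 0.60, 0.20],
    [0.05, 0.25, 0.70],
])

class DRPDAgent:
    def __init__(self, decay=0.8, rng=None):
        self.decay = decay                        # recency weight: recent attempts count more
        self.memory = []                          # (action, outcome) pairs from past attempts
        self.risk_state = 0                       # start at "low" perceived risk
        self.rng = rng or np.random.default_rng()

    def update_risk_perception(self):
        """Advance perceived risk one step along the assumed Markov chain."""
        self.risk_state = self.rng.choice(len(RISK_STATES), p=TRANSITIONS[self.risk_state])

    def record_attempt(self, action, outcome):
        """Store the result of an escape attempt (e.g. 1 = passable route, 0 = blocked)."""
        self.memory.append((action, outcome))

    def action_score(self, action):
        """Recency-weighted average outcome of past attempts that used this action."""
        weights, outcomes = [], []
        for age, (a, o) in enumerate(reversed(self.memory)):   # age 0 = most recent attempt
            if a == action:
                weights.append(self.decay ** age)
                outcomes.append(o)
        if not weights:
            return 0.5                            # no experience: neutral prior
        return float(np.average(outcomes, weights=weights))

    def choose(self, actions):
        """Recognition-primed choice: accept the first satisficing action judged by experience."""
        threshold = 0.5 + 0.1 * self.risk_state   # higher perceived risk -> stricter acceptance
        for action in actions:
            if self.action_score(action) >= threshold:
                return action
        return max(actions, key=self.action_score)  # otherwise fall back to the best-remembered option
```

Under this reading, a `decay` close to 1 makes old and new experience count almost equally, while a smaller value emphasizes the most recent attempts, which is one way the recency effect described above could be operationalized in a MAS agent.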