Even today, nearly 80 years after the atomic bomb (A-bomb) was dropped, debates continue over the exact doses received by the A-bomb survivors. Although the initial airborne kerma (or the energy spectrum of the emitted radiation) can be measured with sufficient accuracy to assess the radiation dose to A-bomb survivors, it is difficult to assess the neutron dose accurately, including the appropriate weighting of the neutron absorbed dose. In particular, possible post-explosion exposure to radioactive particles generated through neutron activation has been almost entirely neglected so far, mainly because of the large uncertainty associated with the behavior of those particles. However, the contribution of such non-initial radiation exposure from neutron-induced radioactive particles could be significant: rates of stable chromosomal aberrations, which indicate average whole-body radiation doses, were found to be more than 30% higher for survivors exposed indoors than for those exposed outdoors, even at the same initial dose estimated in the Life Span Study. In this Mini Review article, the authors explain that these apparently contradictory observations can be reasonably explained by assuming a higher production rate of neutron-induced radioactive particles in the indoor environment near the hypocenter.