In nature, crabs possess panoramic vision for localizing and perceiving accelerating motion, from local segments to the global view, in order to guide reactive behaviours including escape. The visual neuronal ensemble of the crab plays a crucial role in this capability; however, it has never been investigated and modelled as an artificial vision system. To bridge this gap, we propose an accelerating motion perception model (AMPM) that mimics the visual neuronal ensemble of the crab. The AMPM comprises two main parts, wherein the presynaptic network, adopted from previous modelling work, simulates 16 MLG1 neurons covering the entire visual field to localize moving objects. The emphasis here is laid on the original modelling of the MLG1s' postsynaptic network to perceive accelerating motion from a global view, which employs a novel spatio-temporal difference encoder (STDE) and an adaptive spiking threshold temporal difference encoder (AT-TDE). Specifically, the STDE transforms the "time-to-travel" between activations of two successive MLG1 segments into an excitatory postsynaptic current (EPSC), which decays as time elapses. The AT-TDE, embedded in two directional (counter-clockwise and clockwise) acceleration detectors, guarantees non-firing in response to constant-speed movement. Accordingly, accelerating motion can be effectively localized and perceived by the whole network. Systematic experiments verified the feasibility and robustness of the proposed method. The model's responses to translational accelerating motion also fit many of the explored physiological features of direction-selective neurons in the lobula complex of the crab (i.e., lobula complex direction cells, LCDCs). This modelling study not only provides a reasonable hypothesis for these biological neural pathways, but is also critical for developing a new neuromorphic sensing strategy.
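To make the STDE/AT-TDE mechanism concrete, a minimal numerical sketch follows. It is not the paper's implementation: the function names, the exponential form of the EPSC decay, the additive threshold adaptation rule, and all parameter values (tau, theta0, adapt) are illustrative assumptions chosen only to reproduce the qualitative behaviour described above (an EPSC that decays with travel time, silence for constant speed, sustained firing for acceleration).

```python
import numpy as np

def stde_epsc(delta_t, tau=0.05, i_max=1.0):
    """STDE sketch: map the "time-to-travel" between activations of two
    successive MLG1 segments to an EPSC amplitude that decays as the
    travel time grows (shorter travel time -> faster motion -> larger
    current). The exponential form and tau are assumptions."""
    return i_max * np.exp(-delta_t / tau)

def at_tde_spikes(activation_times, tau=0.05, theta0=0.02, adapt=1.0):
    """AT-TDE sketch for one direction (e.g. clockwise): fire only when
    the EPSC of the current segment pair exceeds a threshold adapted to
    the previous EPSC, so constant-speed motion (equal travel times and
    hence equal EPSCs) is silenced after the onset. The additive
    adaptation rule is an assumption."""
    travel = np.diff(activation_times)        # time-to-travel per segment pair
    epsc = stde_epsc(travel, tau=tau)         # STDE output per segment pair
    theta = theta0
    spike_times = []
    for t_k, i_k in zip(activation_times[1:], epsc):
        if i_k > theta:                       # EPSC beats the adapted threshold
            spike_times.append(t_k)
        theta = theta0 + adapt * i_k          # threshold tracks the last EPSC
    return np.array(spike_times)

# Constant speed: equal travel times of 50 ms between segments.
const = np.arange(0.0, 0.5, 0.05)
# Acceleration: travel times shrink by 15% per segment.
accel = np.cumsum(0.08 * 0.85 ** np.arange(10))

for name, times in [("constant speed", const), ("accelerating", accel)]:
    print(name, "->", len(at_tde_spikes(times)), "spikes")
```

Under these assumed parameters, the constant-speed stimulus elicits only a single onset spike before the adapted threshold silences the detector, whereas the shrinking travel times of the accelerating stimulus keep the EPSC rising faster than the threshold adapts, yielding sustained firing; this contrast illustrates the non-firing property attributed to the AT-TDE.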