Shannon information entropy theory is used to explain the recently proposed isobaric yield ratio difference (IBD) probe, which aims to determine the nuclear symmetry energy. Theoretically, the difference between the Shannon uncertainties carried by isobars in two different reactions (ΔI21) is found to be equivalent to the difference between the chemical potentials of protons and neutrons in the reactions [the IBD probe, IB-(βμ)21, with β the inverse temperature]. From the viewpoint of Shannon information entropy, the physical meaning of the above chemical potential difference is interpreted via ΔI21 as denoting the nuclear symmetry energy, or the density difference between neutrons and protons in the reactions, more concisely than in the statistical abrasion-ablation model.

Open access under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/). Funded by SCOAP3.

Nuclear matter with densities ranging from sub-saturation to supra-saturation can be produced in heavy-ion collisions (HICs). Because the nuclear density and the nuclear symmetry energy are difficult to measure directly, various probes have been proposed to study nuclear properties on the basis of different models. The results of these probes differ from one another to varying extents, both theoretically and experimentally [1-7]. The entire process of an HIC is dynamical: the nuclear matter passes through a hot, high-density state produced by the violent compression of the projectile and target nuclei, and then through dilute states as the system expands. Finally, the residue fragments, which no longer emit particles and are chemically frozen, are measured. Many probes of nuclear properties in HICs are based on the yields of fragments [1,8-16].

The Shannon information entropy, put forward by C. E. Shannon, measures the uncertainty in a random variable and quantifies the expected value of the information contained in a message [17].
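As a minimal illustration of the definition above, the sketch below computes the Shannon entropy H = -Σ p log2(p) for a normalized probability distribution. The fragment-yield numbers are purely hypothetical and serve only to show how a measured yield distribution could be converted to probabilities before evaluating H; they are not taken from the reactions discussed in this work.

```python
import math

def shannon_entropy(probs):
    """Shannon information entropy H = -sum(p * log2 p), in bits.

    Terms with p = 0 contribute nothing (the conventional 0*log 0 = 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical fragment yields, normalized to a probability distribution.
yields = [8, 4, 2, 2]
total = sum(yields)
probs = [y / total for y in yields]          # [0.5, 0.25, 0.125, 0.125]

H = shannon_entropy(probs)                   # 1.75 bits for this distribution
```

A sharply peaked distribution gives a small H (low uncertainty), while a uniform distribution over N outcomes gives the maximum H = log2(N), which is the sense in which entropy measures the unpredictability of the variable.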
The Shannon entropy gives the average unpredictability of a random variable, which is equivalent to its information content; it provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference called the maximum-entropy estimate [18]. In information communication, the