The collective modes of two-dimensional ordered atomic arrays can modify the radiative environment of embedded atomic impurities. We analyze the effect of lattice geometry on the impurity's emission linewidth by comparing the effective impurity decay rates obtained for all non-centered Bravais lattices and, additionally, for the honeycomb lattice. We demonstrate that the lattice geometry plays a crucial role in determining the effective decay rate of the impurity. In particular, we find that the minimal effective decay rate occurs in lattices where the number of the impurity's nearest neighbors is maximal and the number of distinct distances among those neighbors is minimal. We further show that, between interstitial and substitutional placements of the impurity, the interstitial placement always yields a lower decay rate and hence longer photon storage. For interstitial placements, we determine the optimal impurity position in the lattice plane, which does not necessarily coincide with the center of a lattice plaquette.