Decentralized detection is studied for parallel-access sensor networks in which the sensor statistics are not known completely and are instead assumed to follow distribution functions belonging to known uncertainty classes. It is shown that no minimax robust tests exist over the deterministic decision rules for uncertainty classes built with respect to the Kullback-Leibler (KL) divergence. For the KL divergence, as well as for some other uncertainty classes, such as those based on the α-divergences, the joint stochastic boundedness property, which is the fundamental tool for proving minimax robustness, fails to hold. This raises the natural question of whether the minimax robust decentralized detection problem can still be solved when the uncertainty classes do not possess this property. The answer is shown to be positive, which leads to a generalization of an existing work. Moreover, it is shown that for Huber's extended uncertainty classes the quantization functions at the sensors are not required to be monotone in order to claim minimax robustness. Possible generalizations of the theory to minimax- and Neyman-Pearson formulations, repeated observations, imperfect reporting channels, and different network topologies are discussed. Simulation examples are provided for clipped- and censored-likelihood-ratio tests.
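As an illustrative sketch only (not the paper's implementation), the clipped likelihood ratio test mentioned above can be written as follows. The Gaussian shift-in-mean densities, the clipping thresholds `c_lower` and `c_upper`, and the decision threshold `t` are all assumed for the example:

```python
import math


def clipped_lr(x, f0, f1, c_lower, c_upper):
    """Huber-style clipped likelihood ratio: the ratio f1(x)/f0(x)
    is truncated to the interval [c_lower, c_upper]."""
    lr = f1(x) / f0(x)
    return min(max(lr, c_lower), c_upper)


def f0(x):
    # Nominal density under H0: standard normal N(0, 1) (assumed example)
    return math.exp(-x ** 2 / 2) / math.sqrt(2 * math.pi)


def f1(x):
    # Nominal density under H1: shifted normal N(1, 1) (assumed example)
    return math.exp(-(x - 1) ** 2 / 2) / math.sqrt(2 * math.pi)


def decide(x, t=1.0, c_lower=0.5, c_upper=2.0):
    """Decide for H1 (return 1) if the clipped likelihood ratio
    exceeds the threshold t, otherwise decide for H0 (return 0)."""
    return int(clipped_lr(x, f0, f1, c_lower, c_upper) > t)
```

Clipping bounds the influence of any single observation, which is what makes this statistic a candidate for robustness against distributional uncertainty around the nominal densities; a censored test analogously maps extreme likelihood ratio values to fixed constants.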