Highly unstable environments can be modeled by arbitrarily varying (information) sources (AVS). We study multiple hypothesis testing (HT) for such sources within two approaches established in the information-theoretic treatment of statistical analysis. First, we characterize the attainable exponent trade-offs for all kinds of error probabilities and identify the corresponding decision schemes, or testing strategies. Then we treat the same problem from an optimality-achieving perspective. Moreover, Chernoff bounds for both binary and M-ary HT are specified by means of a Sanov-type theorem for AVSs. Additional geometric interpretations help clarify the structure of HT in the derived solutions.