The switch Markov chain has been extensively studied as the most natural Markov Chain Monte Carlo approach for sampling graphs with prescribed degree sequences. We use comparison arguments with other Markov chains, less natural but simpler to analyze, to show that the switch chain mixes rapidly in two different settings. We first study the classic problem of uniformly sampling simple undirected, as well as bipartite, graphs with a given degree sequence. We apply an embedding argument, involving a Markov chain defined by Jerrum and Sinclair (TCS, 1990) for sampling graphs that almost have a given degree sequence, to show rapid mixing for degree sequences satisfying strong stability, a notion closely related to P-stability. This yields a much shorter proof that unifies the currently known rapid mixing results for the switch chain and extends them up to sharp characterizations of P-stability. In particular, our work resolves an open problem posed by Greenhill (SODA, 2015).

Second, to illustrate the power of our approach, we study the problem of uniformly sampling graphs for which, in addition to the degree sequence, a joint degree distribution is given. Although the problem was formalized over a decade ago, and despite its practical significance in generating synthetic network topologies, little progress has been made on the random sampling of such graphs. The case of a single degree class reduces to the sampling of regular graphs, but beyond this almost nothing is known. We fully resolve the case of two degree classes by showing that the switch Markov chain is always rapidly mixing. Again, we first analyze an auxiliary chain for strongly stable instances on an augmented state space and then use an embedding argument.
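For concreteness, the Python sketch below illustrates a single move of the standard switch (double edge swap) operation on a simple undirected graph. It is a minimal illustration only, not code from this work: the edge-set representation and the name `switch_step` are our own assumptions, and the bipartite and joint-degree variants studied here additionally restrict which swaps are allowed.

```python
import random

def switch_step(edges, rng=random):
    """One attempted move of the switch (double edge swap) chain.

    `edges` is a set of frozensets of size 2 representing a simple
    undirected graph.  Two distinct edges are picked uniformly at random
    and one of the two possible rewirings of their endpoints is proposed;
    proposals that would create a loop or a parallel edge are rejected,
    so every step preserves the degree sequence.
    """
    e1, e2 = rng.sample(list(edges), 2)
    a, b = tuple(e1)
    c, d = tuple(e2)
    # Propose one of the two alternative pairings of the four endpoints.
    if rng.random() < 0.5:
        proposal = [frozenset((a, c)), frozenset((b, d))]
    else:
        proposal = [frozenset((a, d)), frozenset((b, c))]
    # Reject if the result is not simple (a loop or an already existing edge).
    if any(len(e) < 2 or e in edges for e in proposal):
        return edges
    return (edges - {e1, e2}) | set(proposal)

# Example: repeated switches on a 6-cycle (all degrees equal to 2).
G = {frozenset((i, (i + 1) % 6)) for i in range(6)}
for _ in range(10_000):
    G = switch_step(G)
```

Iterating such moves yields an approximately uniform sample from the graphs with the given degree sequence whenever the chain mixes rapidly, which is precisely what is established here for strongly stable degree sequences.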