This thesis consists of two parts. The first presents several advances in sequential Monte Carlo (SMC) methods. A new class of SMC methods, Nested Sampling via Sequential Monte Carlo (NS-SMC), is introduced, which reframes the nested sampling (NS) method of Skilling in terms of SMC techniques. In contrast to NS, the analysis of NS-SMC does not require the (unrealistic) assumption that the simulated samples are independent. The new framework yields provably consistent and unbiased estimates of marginal likelihoods when Markov chain Monte Carlo (MCMC) is used to produce new samples. As the original NS algorithm is a special case of NS-SMC, this provides insight into why NS appears to produce accurate estimates despite a typical violation of its assumptions. Novel calibration methods that apply generally to SMC samplers are also introduced, and are applied in a numerical study comparing the performance of NS-SMC and temperature-annealed SMC on several challenging and realistic statistical problems.

The second part of the dissertation presents several novel Monte Carlo methods for estimating distributional quantities related to sums of random variables. For the sum of dependent log-normal random variables, novel estimators of the left tail (cumulative distribution function), the right tail (complementary cumulative distribution function), and the probability density function are introduced. Numerical experiments demonstrate that, in all three settings, the proposed methodology delivers accurate estimates in regimes where existing methods have large variance and tend to underestimate the quantity of interest. Theoretical efficiency results are established for the left- and right-tail estimators, and a method for exactly sampling dependent log-normal random variables conditional on a rare left-tail event is also presented. Finally, a novel estimator of the probability density function of a sum of random variables in a more general setting is studied, which enables the estimation of marginal probability density functions in the context of approximate sampling via MCMC.