Continued scaling of bulk CMOS technology faces formidable challenges. FinFETs, with their superior control of short-channel effects, offer a promising alternative at the 22-nm technology node and beyond. However, FinFETs still suffer from process, voltage, and temperature (PVT) variations; hence, statistical static timing analysis (SSTA) is better suited than traditional static timing analysis for analyzing the delay of FinFET logic circuits. In this paper, using an existing SSTA algorithm as a foundation, we analyze silicon-on-insulator FinFET circuits in several new ways: 1) basing the analysis on accurate device simulation of the logic library; 2) considering voltage and temperature variations in addition to process variations; 3) deriving accurate timing and leakage macromodels; and 4) investigating the impact of PVT variations on circuit-level delay and power distributions. We propose a simplified timing model that greatly reduces computational complexity without introducing convergence issues; it achieves average absolute errors of 3.4% and 4.4% for gate output slope and gate delay, respectively, across all logic gates and sizes, relative to accurate quasi-Monte Carlo simulations. We evaluate the performance of our SSTA algorithm against Monte Carlo simulation and extend it to enable statistical leakage and dynamic power analysis as well. We show that deterministic optimization methods can optimize both the mean and the variance of circuit delay and power distributions, and in some cases even the ratio of standard deviation to mean. Finally, we show that FinFET circuits must be optimized with temperature taken into account, since the ratio between the leakage and dynamic power of a circuit can vary drastically with the assumed operating temperature.
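To make the comparison concrete, the kind of Monte Carlo baseline that SSTA is evaluated against can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: gate delays, the toy two-path netlist, and all nominal/sigma values are hypothetical, and delays are drawn from independent Gaussians for simplicity (real PVT variations are correlated).

```python
import random
import statistics

def sample_gate_delay(nominal, sigma, rng):
    # Hypothetical Gaussian gate-delay model: nominal delay plus
    # random PVT-induced variation, clipped at zero (units: ps).
    return max(0.0, rng.gauss(nominal, sigma))

def mc_circuit_delay(n_samples=20000, seed=42):
    # Monte Carlo timing analysis of a toy netlist: two parallel
    # two-gate paths reconverging at a final gate. The arrival time
    # at the reconvergent node is the statistical max of the paths,
    # which is what makes block-based SSTA nontrivial.
    rng = random.Random(seed)
    arrivals = []
    for _ in range(n_samples):
        path_a = (sample_gate_delay(10.0, 1.0, rng)
                  + sample_gate_delay(12.0, 1.5, rng))
        path_b = (sample_gate_delay(11.0, 1.2, rng)
                  + sample_gate_delay(11.0, 1.2, rng))
        arrivals.append(max(path_a, path_b)
                        + sample_gate_delay(5.0, 0.5, rng))
    return statistics.mean(arrivals), statistics.stdev(arrivals)

mean_delay, sigma_delay = mc_circuit_delay()
```

Note that the mean of the max exceeds the max of the nominal path delays (27.0 ps here), which is exactly the pessimism-vs-accuracy trade-off that motivates statistical rather than deterministic timing analysis.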