The conventional channel resolvability problem refers to the determination of the minimum rate required for an input process so that the output distribution approximates a target distribution in either the total variation distance or the relative entropy. In contrast to previous works, in this paper, we use the (normalized or unnormalized) Rényi divergence (with the Rényi parameter in [0, 2] ∪ {∞}) to measure the level of approximation. We also provide asymptotic expressions for the normalized Rényi divergence when the Rényi parameter is larger than or equal to 1, as well as (lower and upper) bounds for the case when the same parameter is smaller than 1. We characterize the Rényi resolvability, which is defined as the minimum rate required to ensure that the Rényi divergence vanishes asymptotically. The Rényi resolvabilities are the same for both the normalized and unnormalized divergence cases. In addition, when the Rényi parameter is smaller than 1, consistent with the traditional case where the Rényi parameter is equal to 1, the Rényi resolvability equals the minimum mutual information over all input distributions that induce the target output distribution. When the Rényi parameter is larger than 1, the Rényi resolvability is, in general, larger than the mutual information. The optimal Rényi divergence is proven to vanish at least exponentially fast in both cases, as long as the code rate is larger than the Rényi resolvability. The optimal exponential rate of decay for i.i.d. random codes is also characterized exactly. We apply these results to the wiretap channel, and completely characterize the optimal tradeoff between the rates of the secret and non-secret messages when the leakage measure is given by the (unnormalized) Rényi divergence. This tradeoff differs from that in the conventional setting, in which the leakage is measured by the traditional mutual information.
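As a numerical illustration (not part of the paper), the sketch below evaluates the Rényi divergence D_α(P‖Q) = (1/(α−1)) log Σ_x P(x)^α Q(x)^{1−α} that serves as the approximation measure, using the standard definition with its limiting cases α = 1 (relative entropy) and α = ∞ (max-divergence). The distributions `p` and `q` are arbitrary examples; the computed values exhibit the well-known monotonicity of D_α in α, which is why measuring approximation with α > 1 is more stringent than with the relative entropy.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats for finite distributions.

    Handles the limiting cases alpha = 1 (KL divergence) and
    alpha = inf (max-divergence, log max_x p(x)/q(x)).
    Assumes q(x) > 0 wherever p(x) > 0.
    """
    if alpha == 1:  # relative entropy (KL divergence)
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    if alpha == math.inf:  # max-divergence
        return max(math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Example distributions (illustrative only).
p = [0.5, 0.5]
q = [0.25, 0.75]

# D_alpha is nondecreasing in alpha over [0, 2] and up to infinity.
for a in [0.5, 1, 2, math.inf]:
    print(f"alpha = {a}: D = {renyi_divergence(p, q, a):.4f}")
```

For this example, D_2(P‖Q) = log Σ_x P(x)²/Q(x) = log(4/3) ≈ 0.2877 nats, strictly exceeding the relative entropy D_1(P‖Q) ≈ 0.1438 nats, consistent with the fact that approximation under a Rényi parameter larger than 1 is a stronger requirement.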