Persuasion, defined as the act of exploiting an informational advantage in order to effect the decisions of others, is ubiquitous. Indeed, persuasive communication has been estimated to account for almost a third of all economic activity in the US. This paper examines persuasion through a computational lens, focusing on what is perhaps the most basic and fundamental model in this space: the celebrated Bayesian persuasion model of Kamenica and Gentzkow [34]. Here there are two players, a sender and a receiver. The receiver must take one of a number of actions with a priori unknown payoffs, and the sender has access to additional information regarding the payoffs of the various actions for both players. The sender can commit to revealing a noisy signal regarding the realization of these payoffs, and would like to do so in a way that maximizes her own expected payoff, assuming that the receiver rationally acts to maximize his own payoff. When the payoffs of the various actions follow a joint distribution (the common prior), the sender's problem is nontrivial, and its computational complexity depends on the representation of this prior.

We examine the sender's optimization task in three of the most natural input models for this problem, and essentially pin down its computational complexity in each. When the payoff distributions of the different actions are i.i.d. and given explicitly, we exhibit a polynomial-time (exact) algorithmic solution, and a "simple" (1 − 1/e)-approximation algorithm. Our optimal scheme for the i.i.d. setting involves an analogy to auction theory, and makes use of Border's characterization of the space of reduced forms of single-item auctions. When action payoffs are independent but non-identical, with marginal distributions given explicitly, we show that it is #P-hard to compute the optimal expected sender utility. In doing so, we rule out a generalized Border's theorem, as defined by Gopalan et al. [30], for this setting. Finally, we consider a general (possibly correlated) joint distribution of action payoffs presented by a black-box sampling oracle, and exhibit a fully polynomial-time approximation scheme (FPTAS) with a bi-criteria guarantee. Our FPTAS is based on Monte Carlo sampling, and its analysis relies on the principle of deferred decisions. Moreover, we show that this result is the best possible in the black-box model for information-theoretic reasons.

A somewhat less artificial example of persuasion is in the context of providing financial advice. Here, the receiver is an investor, actions correspond to stocks, and the sender is a stockbroker or financial adviser with access to stock return projections which are a priori unknown to the investor. When the adviser's commission or return is not aligned with the investor's returns, this is a nontrivial Bayesian persuasion problem. In fact, interesting examples exist when stock returns are independent of each other, or even i.i.d. Consider the following simple example which fits into the i.i.d. model considered in Section 3: the...
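For concreteness, recall how the sender's problem described above is typically formalized when the common prior is given explicitly over a finite set of payoff realizations; the notation here is illustrative and not drawn from this paper. By a standard revelation-principle-style argument, one may restrict attention to direct schemes whose signals are action recommendations the receiver is willing to follow. Writing $\Theta$ for the set of payoff realizations, $\mu$ for the common prior, $s(\theta,a)$ and $r(\theta,a)$ for the sender's and receiver's payoffs when the realization is $\theta$ and the receiver takes action $a$, and $\pi(a \mid \theta)$ for the probability of recommending $a$ given $\theta$, the optimal scheme solves the linear program
\[
\begin{aligned}
\max_{\pi \ge 0}\;\; & \sum_{\theta \in \Theta} \mu(\theta) \sum_{a} \pi(a \mid \theta)\, s(\theta, a) \\
\text{s.t.}\;\; & \sum_{\theta \in \Theta} \mu(\theta)\, \pi(a \mid \theta)\, \bigl( r(\theta, a) - r(\theta, a') \bigr) \;\ge\; 0 \qquad \text{for all actions } a, a', \\
& \sum_{a} \pi(a \mid \theta) \;=\; 1 \qquad \text{for all } \theta \in \Theta,
\end{aligned}
\]
where the first family of constraints ensures that obeying each recommendation is a best response for the receiver. This program has size polynomial in $|\Theta|$ and the number of actions; when the prior is specified by independent (or i.i.d.) marginals as in the settings above, $|\Theta|$ is exponential in the number of actions, which is what makes the structured algorithms and sampling-based schemes summarized earlier necessary.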