Theoretical particle physicists continue to push the envelope in both high-performance computing and in managing and analyzing large data sets. For example, the goals of sub-percent accuracy in predictions of quantum chromodynamics (QCD) using large-scale simulations of lattice QCD, and of finding signals of rare events and new physics in the exabytes of data produced by experiments at the high-luminosity Large Hadron Collider (LHC), require new tools beyond developments in hardware alone. Machine learning and artificial intelligence offer the promise of dramatically reducing the computational cost and time. This chapter reviews selected areas where AI/ML tools could have a major impact, provides an overview of the challenges, and discusses how new ideas such as normalizing flows can speed up the generation of gauge configurations needed in lattice QCD calculations; the growth of ML-based surrogate models and pattern matching to reduce the cost of event generators and to analyze experimental data; and the use of ML in the search for viable vacua in the landscape of string theories. While such approaches transform aspects of particle theory into computational problems, and thus into black boxes, we argue that physics-aware development of these tools, combined with algorithms that ensure the results are bias-free, will continue to require a deep understanding of the physics. We see this broader transformation as akin to formulating and extracting observables from simulations of lattice QCD: a numerical integration of the path-integral formulation of QCD that nevertheless requires a deep understanding of the underlying quantum field theory, the Standard Model of particle physics, and effective field theory methods.