This paper investigates the effect of oscillator phase noise on a multiuser millimeter wave (mmWave) massive MIMO uplink as the number of base station antennas is scaled up while the load factor, defined as the ratio of the number of simultaneous users to the number of base station antennas, is held fixed. We consider a modular approach in which the base station employs an array of subarrays, or "tiles." Each tile supports a fixed number of antennas and can therefore be implemented using a separate radio frequency integrated circuit (RFIC), with synchronization across tiles accomplished by a phase-locked loop (PLL) in each tile that synthesizes the carrier-frequency local oscillator on-chip by locking onto a common lower-frequency reference clock. Assuming linear minimum mean squared error (LMMSE) multiuser detection, we provide an analytical framework that can be used to specify the required phase noise power spectral density (PSD) mask for a target system performance. Our analysis of the phase noise at the output of the LMMSE receiver reveals two distinct effects: self-noise for each user, which is inversely proportional to the number of tiles, and cross-talk between users, which is insensitive to the number of tiles and proportional to the load factor. These analytical predictions, verified by simulations for a 140 GHz system targeting a per-user data rate of 10 Gbps, show that tiling is a robust approach for scaling. Numerical results for our proposed design approach yield relatively relaxed specifications for the phase noise PSD mask.
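The scaling behavior stated above can be summarized schematically as follows; this is an illustrative sketch rather than the paper's own formulation, and the symbols $\sigma_{\phi}^{2}$ (per-oscillator phase noise variance), $N_T$ (number of tiles), $K$ (number of users), $M$ (number of base station antennas), and the constants $c_{\mathrm{self}}$, $c_{\mathrm{x}}$ are notational assumptions introduced here for illustration:

% Schematic decomposition of the phase noise contribution at the LMMSE
% output for a given user (illustrative notation, not the paper's own):
% a self-noise term that averages down across the N_T independently
% synthesized tile oscillators, plus a cross-talk term that scales with
% the load factor beta = K/M but is insensitive to N_T.
\begin{equation}
  \sigma^{2}_{\mathrm{out}}
    \;\approx\;
    \underbrace{\frac{c_{\mathrm{self}}\,\sigma_{\phi}^{2}}{N_T}}_{\text{self-noise}}
    \;+\;
    \underbrace{c_{\mathrm{x}}\,\beta\,\sigma_{\phi}^{2}}_{\text{cross-talk}},
  \qquad
  \beta = \frac{K}{M}.
\end{equation}

Under this reading, adding tiles suppresses the self-noise term, while the cross-talk floor is controlled by the load factor $\beta$, which is consistent with the abstract's claim that tiling scales robustly at a fixed load factor.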