With the growing demand for latency-critical and computation-intensive Internet of Things (IoT) services, mobile edge computing (MEC) has emerged as a promising technique to reinforce the computation capability of resource-constrained mobile devices. To exploit cloud-like functions at the network edge, service caching has been implemented to (partially) reuse computation tasks (e.g., input/output data and program files), thus effectively reducing the delay incurred by data retransmissions and/or the computation burden due to repeated execution of the same task. In a multiuser cache-assisted MEC system, service caching designs depend on users' preferences for different types of services, which are at times highly correlated with the locations where the requests are made. In this paper, we exploit users' location-dependent service preference profiles to formulate a cache placement optimization problem in a multiuser MEC system. Specifically, we consider multiple representative locations, where users at the same location share the same preference profile for a given set of services. In a frequency-division multiple access (FDMA) setup, we jointly optimize the binary cache placement, edge computation resources, and bandwidth allocation to minimize the expected weighted-sum energy of the edge server and the users with respect to the users' preference profiles, subject to bandwidth, computation, and latency constraints. To effectively solve the resulting mixed-integer non-convex problem, we propose a deep learning-based offline cache placement scheme using a novel stochastic quantization-based discrete-action generation method. In special cases, we also obtain low-complexity suboptimal caching decisions by leveraging the structure of the optimal solution. Simulation results verify the performance of the proposed scheme and the effectiveness of service caching in general.
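For concreteness, the following is a minimal sketch of the type of formulation described above, using assumed notation not drawn from the paper itself: $c_n \in \{0,1\}$ denotes the caching decision for service $n$ with storage size $s_n$, $B_k$ and $f_k$ denote the bandwidth and edge CPU frequency allocated to location $k$, $E^{\mathrm{s}}$ and $E^{\mathrm{u}}_k$ denote the server and user energies, and $\boldsymbol{p}$ denotes the location-dependent preference profiles over which the expectation is taken; the actual formulation in the paper may differ in its details.
\begin{align}
\min_{\{c_n\},\{B_k\},\{f_k\}} \quad & \mathbb{E}_{\boldsymbol{p}}\!\left[\alpha\, E^{\mathrm{s}} + \sum_{k}\beta_k\, E^{\mathrm{u}}_k\right] \\
\text{s.t.} \quad & \sum_{n} c_n s_n \le C, \quad c_n \in \{0,1\}, \ \forall n, \\
& \sum_{k} B_k \le B, \quad \sum_{k} f_k \le F, \\
& T_k \le T_k^{\max}, \quad \forall k,
\end{align}
where $C$, $B$, and $F$ denote the edge caching, bandwidth, and computation budgets, $\alpha$ and $\beta_k$ are the energy weights, and $T_k$ is the service latency experienced at location $k$.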