Autonomic computing is emerging as a significant new approach to the design of computing services. Its goal is the development of services that are able to manage themselves with minimal direct human intervention and, in particular, that are able to sense their environment and tune themselves to meet end-user needs. However, the impact on performance of the interaction between multiple uncoordinated self-optimizing services is not yet well understood. We present recent results on a non-cooperative load-balancing game that help to better understand the outcome of this interaction. In this game, users generate jobs belonging to different services, and the jobs have to be processed on one of the servers of a computing platform. Each service has its own dispatcher, which probabilistically routes jobs to servers so as to minimize the mean processing cost of its own jobs. We first investigate the impact of heterogeneity in the amount of incoming traffic routed by the dispatchers and present a result stating that, for a fixed amount of total incoming traffic, the worst-case overall performance occurs when every dispatcher routes the same amount of traffic. Using this result, we then study the so-called Price of Anarchy (PoA), an oft-used worst-case measure of the inefficiency of non-cooperative decentralized architectures. We give explicit bounds on the PoA for cost functions representing the mean delay of jobs when the service discipline is Processor Sharing (PS) or Shortest Remaining Processing Time (SRPT). These bounds indicate that significant performance degradation can result from the selfish behavior of self-optimizing services. In practice, though, the worst-case scenario may occur rarely, if at all. Indeed, recent results suggest that, for the game under consideration, the PoA is an overly pessimistic measure that does not reflect the performance obtained in most instances of the problem.
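To make the worst-case measure concrete: the PoA of a game is the ratio between the social cost of its worst Nash equilibrium and the minimal social cost achievable by a centralized planner, so a PoA of 1 means selfish routing loses nothing. The Python sketch below is only an illustration, not the model analyzed in the paper: it computes this ratio for a single toy instance in which a few dispatchers split Poisson traffic over two PS servers modeled as M/M/1 queues (mean sojourn time 1/(r - rho) at a server with service rate r and load rho), finding an equilibrium by best-response dynamics. All parameter values (R, K, LAM) are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch only; all parameters are assumptions, not from the paper.
R = np.array([2.0, 1.0])            # server service rates
K = 4                               # number of competing dispatchers
LAM = 2.4                           # total arrival rate; each dispatcher routes LAM / K
GRID = np.linspace(0.0, 1.0, 2001)  # candidate fractions routed to server 0

def delay(load, r):
    """M/M/1-PS mean sojourn time at a server with service rate r."""
    return 1.0 / (r - load) if load < r else float("inf")

def own_cost(q, others0, others1, lam_k):
    """Mean delay of one dispatcher sending fraction q of its traffic to
    server 0, given the loads (others0, others1) placed by the others."""
    c = 0.0
    if q > 0:
        c += q * delay(others0 + q * lam_k, R[0])
    if q < 1:
        c += (1 - q) * delay(others1 + (1 - q) * lam_k, R[1])
    return c

def best_response(p, k, lam_k):
    """Grid search for the fraction minimizing dispatcher k's own mean delay."""
    others0 = (p.sum() - p[k]) * lam_k
    others1 = (K - 1) * lam_k - others0
    costs = [own_cost(q, others0, others1, lam_k) for q in GRID]
    return GRID[int(np.argmin(costs))]

def nash_equilibrium(iters=50):
    """Asynchronous best-response dynamics; typically converges for
    two parallel servers with convex delay functions."""
    lam_k = LAM / K
    p = np.full(K, 0.5)
    for _ in range(iters):
        for k in range(K):
            p[k] = best_response(p, k, lam_k)
    return p

def social_cost(load0, load1):
    """Aggregate delay rate: sum over servers of load times mean delay."""
    return load0 * delay(load0, R[0]) + load1 * delay(load1, R[1])

# Centralized benchmark: one planner splits ALL traffic with a single fraction.
opt_cost = min(social_cost(q * LAM, (1 - q) * LAM) for q in GRID)

p_eq = nash_equilibrium()
load0 = p_eq.sum() * (LAM / K)
eq_cost = social_cost(load0, LAM - load0)

print("equilibrium routing fractions:", np.round(p_eq, 3))
print("equilibrium cost / optimal cost:", eq_cost / opt_cost)
```

The printed ratio lower-bounds the PoA, since any single instance's equilibrium-to-optimum ratio is at most the supremum over all instances. This gap between one instance and the supremum is precisely why, as noted above, the PoA can be far more pessimistic than the performance observed in typical instances.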