Measuring an array of variables is central to many systems, including imagers (arrays of pixels), spectrometers (arrays of spectral bands), and lighting systems. Each measurement, however, is prone to noise and potential sensor saturation. A growing number of methods recognize that such problems can be reduced by multiplexing the measured variables: in each measurement, multiple variables (radiation channels) are mixed (multiplexed) by a code, and after data acquisition the variables are decoupled computationally in post-processing. Potential benefits of multiplexing include an increased signal-to-noise ratio and accommodation of scene dynamic range. However, existing multiplexing schemes, including Hadamard-based codes, are inhibited by fundamental limits set by sensor saturation and by Poisson-distributed photon noise, which is scene dependent. There is thus a need for optimal codes that best increase the signal-to-noise ratio while accounting for these effects. Hence, this paper pursues such optimal measurements, which avoid saturation and account for the signal dependence of the noise. The paper derives lower bounds on the mean squared error of the demultiplexed variables. These bounds are useful for assessing the optimality of numerically searched multiplexing codes, thus expediting the numerical search. Furthermore, the paper states the necessary conditions for attaining the lower bounds by a general code. We show that graph theory can be harnessed for finding such ideal codes, through the use of strongly regular graphs.
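
To make the multiplex/demultiplex pipeline concrete, the following is a minimal sketch assuming a classical Hadamard S-matrix code (one of the baseline schemes mentioned above) and Poisson photon noise. The channel count, intensity values, and least-squares decoder are illustrative assumptions; the sketch does not reproduce the paper's optimal, saturation-aware codes.

```python
# Generic multiplexed acquisition and computational demultiplexing.
# Assumptions (not from the paper): a 7x7 Hadamard S-matrix code,
# uniform random channel intensities, and pseudo-inverse decoding.
import numpy as np
from scipy.linalg import hadamard

def s_matrix(n):
    """Build the n x n S-matrix (0/1 code) from a Hadamard matrix of order n+1."""
    H = hadamard(n + 1)              # Sylvester Hadamard matrix, entries in {-1, +1}
    return (1 - H[1:, 1:]) // 2      # drop first row/column, map {+1, -1} -> {0, 1}

rng = np.random.default_rng(0)
n = 7                                   # number of variables (radiation channels)
x_true = rng.uniform(50, 500, size=n)   # unknown channel intensities (photon counts)

W = s_matrix(n)                         # multiplexing code: each row mixes ~half the channels
y = rng.poisson(W @ x_true)             # each measurement is Poisson-noisy (photon noise)

# Demultiplex computationally: least-squares estimate via the code's pseudo-inverse.
x_hat = np.linalg.pinv(W) @ y

# Baseline: trivial (identity-code) acquisition, measuring each channel separately.
x_trivial = rng.poisson(x_true)

print("multiplexed MSE:", np.mean((x_hat - x_true) ** 2))
print("single-shot MSE:", np.mean((x_trivial - x_true) ** 2))
```

Note that under signal-dependent Poisson noise (and under saturation limits) the advantage of a fixed Hadamard-based code can shrink or disappear, which is precisely what motivates the search for optimal codes and the lower bounds developed in the paper.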