Quantization of signals is an integral part of modern signal processing applications, such as sensing, communication, and inference. While signal quantization provides many physical advantages, it usually degrades the performance of subsequent estimation based on the quantized data. In order to satisfy physical constraints while achieving a substantial performance gain, in this work we consider systems with mixed-resolution data, comprising both 1-bit quantized and continuous-valued measurements. First, we describe the linear minimum mean-squared error (LMMSE) estimator and its associated mean-squared error (MSE) for the general mixed-resolution model. However, the MSE of the LMMSE estimator requires the inversion of a matrix whose dimensions are determined by the number of measurements, and thus it is not a tractable tool for optimization and system design. Therefore, we present the linear Gaussian orthonormal (LGO) measurement model and derive a closed-form analytic expression for the MSE of the LMMSE estimator under this model. In addition, we present two common special cases of the LGO model: 1) scalar parameter estimation and 2) channel estimation in mixed analog-to-digital converter (ADC) multiple-input multiple-output (MIMO) communication systems. We then solve the resource allocation optimization problem for the LGO model, using the proposed tractable form of the MSE as the objective function under a power constraint, via a one-dimensional search. Moreover, we present the concept of dithering for mixed-resolution models and optimize the dithering noise as part of the resource allocation optimization problem for two dithering schemes: 1) adding noise only to the quantized measurements and 2) adding noise to both measurement types. Finally, we present simulations that demonstrate the advantages of using mixed-resolution measurements and the potential improvement achieved by dithering and resource allocation.
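As a concrete illustration of the mixed-resolution setting, the following sketch simulates LMMSE estimation of a zero-mean Gaussian scalar parameter from a mix of 1-bit quantized and continuous-valued noisy measurements. The second-order statistics are estimated empirically from Monte Carlo samples rather than from the closed-form expressions derived in the paper, and all model parameters (prior variance, noise level, measurement split) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixed-resolution model (illustrative assumptions, not the paper's setup):
# theta ~ N(0, 1); n_c continuous measurements y = theta + noise,
# and n_q one-bit measurements z = sign(theta + noise).
n_c, n_q = 2, 8            # number of continuous / 1-bit quantized measurements
sigma = 0.5                # measurement-noise standard deviation
n_train, n_test = 200_000, 10_000

def simulate(n):
    theta = rng.standard_normal(n)
    y = theta[:, None] + sigma * rng.standard_normal((n, n_c))           # continuous
    z = np.sign(theta[:, None] + sigma * rng.standard_normal((n, n_q)))  # 1-bit
    return theta, np.hstack([y, z])                                      # mixed data

# Empirical second-order statistics for the LMMSE weights w = C_xx^{-1} c_{x,theta}
theta_tr, x_tr = simulate(n_train)
C_xx = np.cov(x_tr, rowvar=False)
c_tx = (x_tr * theta_tr[:, None]).mean(axis=0)   # cross-covariance (zero-mean model)
w = np.linalg.solve(C_xx, c_tx)

# Evaluate: mixed-resolution LMMSE vs. using only the continuous channels
theta_te, x_te = simulate(n_test)
mse_mixed = np.mean((x_te @ w - theta_te) ** 2)
w_c = np.linalg.solve(C_xx[:n_c, :n_c], c_tx[:n_c])
mse_cont_only = np.mean((x_te[:, :n_c] @ w_c - theta_te) ** 2)
print(f"MSE (mixed 1-bit + continuous): {mse_mixed:.4f}")
print(f"MSE (continuous only):          {mse_cont_only:.4f}")
```

In this toy example, the additional 1-bit measurements typically reduce the empirical MSE relative to using the continuous channels alone, which is the qualitative behavior the mixed-resolution analysis in the paper formalizes.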