In cyber-physical systems, such as 5G and beyond, multiple physical processes require timely online monitoring at a remote device, where the received information is used to estimate current and future process values. When the process data are transmitted over a communication channel, source-channel coding is used to reduce data errors. A high data resolution helps capture the values of the process variables precisely; however, it typically comes with long transmission delays that reduce the usefulness of the data, since the estimation quality degrades over time. In this paper, the trade-off between having recent data and having precise measurements is captured for a Gauss-Markov process. An Age-of-Information (AoI) metric is used to assess data timeliness, while the mean square error (MSE) is used to assess the precision of the predicted process values. AoI appears inherently within the MSE expressions, yet it is comparatively easier to optimize. Our goal is to minimize a time-averaged version of both metrics. We follow a short-blocklength source-channel coding approach and optimize the parameters of the codes in order to characterize an achievability region between MSE and AoI.
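To illustrate how the age enters the prediction error, consider a minimal sketch assuming a discrete-time Gauss-Markov (AR(1)) model $x_{n+1} = \alpha x_n + w_n$ with $|\alpha| < 1$ and $w_n \sim \mathcal{N}(0,\sigma^2)$; the symbols $\alpha$, $\sigma^2$, and the age $\Delta$ are illustrative and not necessarily the notation used in the paper. If the monitor's freshest (noiseless) sample is $\Delta$ time steps old, the MMSE predictor and its error are
\[
\hat{x}_n = \alpha^{\Delta}\, x_{n-\Delta}, \qquad
\mathbb{E}\!\left[(x_n - \hat{x}_n)^2\right]
= \sigma^2 \sum_{k=0}^{\Delta-1} \alpha^{2k}
= \sigma^2\, \frac{1-\alpha^{2\Delta}}{1-\alpha^{2}},
\]
so the MSE grows monotonically with the age $\Delta$, which is the sense in which AoI appears inherently within the MSE expressions; finite-resolution quantization would add a further error term, giving rise to the resolution-versus-delay trade-off studied here.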