Assessing comprehension difficulties requires the ability to measure cognitive load. Changes in cognitive load induced by comprehension difficulties can be detected with adequate time resolution using various biofeedback measures (e.g., changes in pupil diameter). However, identifying the spatiotemporal sources of content comprehension difficulties (i.e., when and where exactly the difficulty occurs within content regions) at a fine granularity is a significant challenge that has not been explicitly addressed in the state of the art. This paper proposes and evaluates an innovative approach named Intelligent Biofeedback Augmented Content Comprehension (TellBack) to explicitly address this challenge. The goal is to autonomously identify regions of digital content that cause comprehension difficulty for users, opening the possibility of providing real-time comprehension support. TellBack is based on assessing the cognitive load associated with content comprehension through non-intrusive, low-cost biofeedback devices that acquire measures such as pupil response and Heart Rate Variability (HRV). To identify when exactly the comprehension difficulty occurs, physiological manifestations of the Autonomic Nervous System (ANS), such as pupil diameter variability and the modulation of HRV, are exploited, whereas the fine spatial resolution (i.e., the region of content the user is looking at) is provided by eye-tracking. The evaluation results of this approach show an accuracy of 83.00% ± 0.75 in classifying regions of content as difficult or not difficult using a Support Vector Machine (SVM), with precision, recall, and micro F1-score of 0.89, 0.79, and 0.83, respectively. Results obtained with four other classifiers, namely Random Forest, k-nearest neighbor, Decision Tree, and Gaussian Naïve Bayes, showed slightly lower precision. TellBack outperforms the state of the art in precision and recall by 23% and 17%, respectively.
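
To make the classification step concrete, the following is a minimal sketch, not the authors' implementation, of how gaze-mapped content regions could be classified as difficult or not difficult with an SVM once each region has been summarized into a fixed-length biofeedback feature vector. The feature names (pupil-diameter variability and HRV statistics) and the synthetic data are assumptions for illustration only.

```python
# Minimal sketch (assumed pipeline, not the paper's code): SVM classification of
# content regions from biofeedback features aggregated per gaze-mapped region.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 200 content regions x 4 hypothetical features
# [pupil_diameter_std, pupil_dilation_slope, hrv_rmssd, hrv_lf_hf_ratio]
X = rng.normal(size=(200, 4))
y = rng.integers(0, 2, size=200)  # 1 = difficult region, 0 = not difficult

# SVM with feature standardization, a common pairing in practice
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Cross-validated accuracy over the labeled regions
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```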