In this paper, we re-elaborate the notions of filter bubble and echo chamber by considering the limitations of human cognitive systems in everyday interactions and in how people experience digital technologies. Researchers who have applied the concepts of filter bubble and echo chamber in empirical investigations treat them as algorithmically induced systems that seclude users of digital technologies from viewpoints and opinions opposed to their own. However, a substantial majority of empirical research has shown that users do encounter and interact with opposing views. Furthermore, we argue that the notion of filter bubble overestimates the social impact of digital technologies when explaining social and political developments, since it neglects the not purely technological circumstances of online behavior and interaction. This motivates us to reconsider the notion's validity and to re-elaborate it in light of existing epistemological theories that address the discomfort people experience when confronting what they do not know. To this end, we survey a series of philosophical reflections on the epistemic limitations of human cognitive systems. In particular, we discuss how knowledge and mere belief are phenomenologically indistinguishable and how the experience of having one's beliefs challenged is a source of epistemic discomfort. We then argue, in contrast with Pariser's assumptions, that digital media users may tend to cling to the viewpoints they already hold precisely because of the "immediate" way in which they experience opposing viewpoints. Since people online experience others and their viewpoints as material features of digital environments, we maintain that this mode of encountering contrasting opinions prompts users to reinforce their preexisting beliefs and attitudes.