Humans perceive gravitational forces on their surroundings through a mix of visual and sensorimotor cues. Presenting such cues accurately is a difficult task in Mixed/Augmented Reality (MR/AR), technological paradigms that blend physical and virtual elements to enhance the way we interact with our environment. Realistically perceiving the weight of virtual elements within an MR/AR scenario aids in the embodiment of those elements within the user's reality, further blurring the line between what is real and what is virtual. Unfortunately, current force feedback devices are neither designed for nor entirely compatible with MR/AR experiences. To address this gap, we explore minimal haptic feedback for weight perception in MR/AR, aiming to simplify the rendering of gravitational cues that are crucial to an immersive experience. Our benchtop device, which delivers feedback at the wrist, improved user experience even within an implicit weight feedback task, i.e., a task in which weight perception was not required for completion. However, challenges arose in mixed real-virtual environments, a cornerstone of MR/AR interaction, where weight discrimination was less accurate. To address this, we developed a compensation scheme for virtual weights, yielding performance on par with that in a purely virtual environment. Our work demonstrates the viability of minimal haptic feedback in MR/AR applications and highlights the importance of integrating weight perception for increased realism. It also fills a research gap in MR/AR development, providing insights for designing future MR/AR systems that integrate with human sensory mechanisms to create virtual interactions that more closely mirror the physical world.