We introduce "Life-Tags," a wearable, smartglasses-based system that abstracts life as clouds of tags and concepts automatically extracted from snapshots of visual reality recorded by wearable video cameras. Life-Tags summarizes users' life experiences with word clouds, providing an "executive summary" of what the visual experience felt like for the smartglasses user during a given period of time, such as a specific day, week, month, or the last hour. In this paper, we focus on (i) design criteria and principles of operation for Life-Tags, such as its first-person, eye-level perspective for recording life, passive logging mode, and privacy-oriented operation, and (ii) technical and engineering aspects of implementing Life-Tags, such as the block architecture diagram highlighting devices, software modules, third-party services, and dataflows. We also conduct a technical evaluation of Life-Tags and report results from a controlled experiment that generated 21,600 full HD snapshots across six indoor and outdoor scenarios representative of everyday activities, such as walking, eating, and traveling, totaling 180 minutes of recorded life to abstract with tag clouds. Our experimental results and prototype inform the design and engineering of future life-abstracting systems based on smartglasses and wearable video cameras, ensuring effective generation of rich clouds of concepts reflective of the user's visual experience.