Autonomous robots in logistics are a promising approach towards a fully automated material flow. To exploit their full potential, however, they must be able to extract semantic information from logistics environments. In contrast to other application areas of autonomous robots (e.g., autonomous driving, service robotics), the logistics domain lacks a common dataset and benchmark suite covering multiple sensor modalities and perception tasks. This paper conceptualizes a framework for artificial perception research in logistics that aims to close this gap in a sustainable, data-driven way. Our framework consists of three components: (1) a foundation based on logistics-specific standards, concepts, and requirements; (2) an open dataset covering multiple sensor modalities and perception tasks; and (3) a standardized benchmark suite. As shown in other research areas, a common and open platform for data-driven research facilitates novel developments and makes results comparable and traceable over time.