The objective of this work is to provide a comprehensive understanding of the development of autonomous vehicle perception systems. To date, most research on autonomous vehicle perception has concentrated on improving the algorithmic quality of perception systems or on combining different sensor setups. In this work, we draw conclusions from our participation in the Indy Autonomous Challenge (IAC) 2021 and its follow-up event at CES in Las Vegas in 2022 (AC@CES). These were the first head-to-head autonomous racing competitions, requiring an entire perception pipeline to perceive both the environment and the surrounding opposing vehicles. Our research includes quantitative results from collected vehicle data and qualitative results from simulation, video footage, and the analysis of multiple races. The IAC was one of the few research projects to consider the autonomous vehicle as a whole; our findings therefore provide insights at the system level, covering both the hardware setup and the full software stack. We demonstrate that the vehicle's different sensor modalities exhibit distinct strengths and weaknesses in deployment. Our results further show the difficulties and challenges that emerge when multi-modal perception systems must run in real time on real-world autonomous vehicles. The central outcome of our investigation is a summary of critical lessons learned in developing and deploying perception systems for autonomous vehicles. Given the setting of the study, our conclusions are inevitably influenced by driving on a racetrack with only one hardware setup available. In the discussion, we therefore draw parallels to driving on public roads in dense traffic. Further studies are needed to investigate the development and deployment of multi-modal perception systems for autonomous road vehicles with different hardware setups and various object detection, localization, and prediction algorithms. The novel contribution of this work is a set of 12 lessons learned, grouped into 5 categories, which were derived and validated through a real-world application project. Videos of the final events in Indianapolis and Las Vegas are available here:

IAC: https://www.youtube.com/watch?v=ERTffn3IpIs&ab_channel=CNETHighlights
AC@CES: https://www.youtube.com/watch?v=df9f4Qfa0uU&ab_channel=CNETHighlights

Multiple modules of the software stack are open source: https://github.com/TUMFTM

INDEX TERMS Autonomous racing, autonomous vehicles, perception systems, software development.

FIGURE 2. Main contribution of this work: the lessons learned presented.

The teams had to develop, deploy, validate, and test a complete autonomous driving stack, from sensor data to vehicle actuation. Because of the challenge's tight schedule, software development had to begin before the vehicles were built. This made simulation of the vehicles and their environment essential to success in the final competition. The IAC took place on October 23, 2021, i...