Assisted and Automated Driving functions are increasingly deployed to improve safety and efficiency and to enhance the driver experience. However, key technical challenges remain, such as the degradation of perception sensor data due to noise factors. The quality of the data generated by sensors can directly impact the planning and control of the vehicle, which in turn can affect vehicle safety. A framework to analyse noise factor effects on automotive environmental perception sensors has recently been proposed and applied to study the effects of noise factors on LiDAR sensors. This work builds on that framework and deploys it to camera sensors, focusing on the specific disturbed sensor outputs via a detailed analysis and classification of camera-specific noise sources. Moreover, the noise factor analysis has been used to identify two omnipresent and independent noise factors (i.e. obstruction and windshield distortion). These noise factors have been modelled to generate noisy camera data, and their impact on the perception step, based on deep neural networks, has been evaluated when the noise factors are applied independently and simultaneously. It is demonstrated that the performance degradation from the combination of noise factors is not simply the accumulated degradation from each individual factor, which underlines the importance of including combined noise factor modelling and testing in performance analysis. Thus, through the findings presented here, the framework can enhance the use of simulation for the development and testing of automated vehicles through careful consideration of the noise factors affecting camera data.
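As a purely illustrative sketch (not the models used in the paper), the non-additivity of combined noise effects can be demonstrated with two hypothetical camera degradations: an obstruction modelled as a rectangular occlusion and a windshield distortion approximated by a Gaussian blur. All function names, parameters, and the pixel-error metric below are assumptions for illustration only.

```python
import numpy as np


def apply_obstruction(img, top, left, h, w):
    # Hypothetical obstruction model: zero out a rectangular image region.
    out = img.copy()
    out[top:top + h, left:left + w] = 0.0
    return out


def apply_distortion(img, sigma=1.5):
    # Hypothetical windshield-distortion model: separable Gaussian blur.
    k = int(3 * sigma) * 2 + 1
    x = np.arange(k) - k // 2
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    g /= g.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, g, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, g, mode="same"), 0, out)
    return out


def degradation(clean, noisy):
    # Crude proxy for performance degradation: mean absolute pixel error.
    return float(np.abs(clean - noisy).mean())


rng = np.random.default_rng(0)
img = rng.random((64, 64))  # synthetic stand-in for a camera frame

d_obs = degradation(img, apply_obstruction(img, 10, 10, 20, 20))
d_dst = degradation(img, apply_distortion(img))
d_both = degradation(img, apply_distortion(apply_obstruction(img, 10, 10, 20, 20)))

# d_both generally differs from d_obs + d_dst: the combined effect is
# not the simple accumulation of the individual degradations.
```

Even in this toy setup, the joint degradation is not the sum of the individual ones, which motivates testing noise factor combinations rather than single factors alone.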