Autonomous ships are expected to improve safety and efficiency in future maritime navigation. Such vessels need perception for two purposes: to maintain autonomous situational awareness and to monitor the integrity of the sensor system itself. To meet these needs, the perception system must fuse data from novel and traditional perception sensors using Artificial Intelligence (AI) techniques. This article reviews the recognized operational requirements imposed on conventional and autonomous seafaring vessels, and then considers suitable sensors and relevant AI techniques for an operational sensor system. The integration of four sensor families is considered: sensors for precise absolute positioning (Global Navigation Satellite System (GNSS) receivers and Inertial Measurement Units (IMUs)), visual sensors (monocular and stereo cameras), audio sensors (microphones), and sensors for remote sensing (RADAR and LiDAR). Additionally, sources of auxiliary data, such as the Automatic Identification System (AIS) and external data archives, are discussed. The perception tasks are related to well-defined problems, such as situational abnormality detection, vessel classification, and localization, that are solvable using AI techniques. Machine learning methods, such as deep learning and Gaussian processes, are identified as especially relevant for these problems. The different sensors and AI techniques are characterized in view of the operational requirements, and example state-of-the-art options are compared in terms of accuracy, complexity, required resources, and compatibility and adaptability to the maritime environment, with particular emphasis on the practical realization of autonomous systems.