In this paper, we present a low-cost multimodal tactile sensor capable of providing accelerometer, gyroscope and pressure data using a 7-axis chip as the sensing element. This approach reduces the complexity of tactile sensor design and of collecting multimodal data. The tactile device is composed of a top layer (a printed circuit board and the sensing element), a middle layer (soft rubber material) and a bottom layer (plastic base), forming a sandwich structure. This structure allows the measurement of multimodal data when force is applied to different parts of the top layer of the sensor. The multimodal tactile sensor is validated with analyses and experiments in both offline and real-time settings. First, the spatial impulse response and sensitivity of the sensor are analysed using accelerometer, gyroscope and pressure data systematically collected from the sensor. Second, the estimation of contact location over a range of sensor positions and force values is evaluated using accelerometer and gyroscope data together with a Convolutional Neural Network (CNN). Third, the estimated contact location is used to control the position of a robot arm. The results show that the proposed multimodal tactile sensor has the potential for robotic applications such as tactile perception for robot control, human-robot interaction and object exploration.
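To illustrate the kind of preprocessing the contact-location pipeline above implies, the sketch below segments a raw 7-axis stream (3-axis accelerometer, 3-axis gyroscope, 1 pressure channel) into fixed-length windows of accelerometer and gyroscope data, the input modality the abstract names for the CNN. The window length, channel ordering and function name are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Assumed channel layout of one raw sample:
# cols 0-2: accelerometer (ax, ay, az)
# cols 3-5: gyroscope     (gx, gy, gz)
# col  6:   pressure
WINDOW = 50  # samples per window (assumed, not from the paper)

def make_windows(readings, window=WINDOW):
    """Split a (T, 7) sensor stream into (N, window, 6) CNN inputs.

    Only the accelerometer and gyroscope channels are kept, matching the
    abstract's use of accel+gyro data for contact-location estimation;
    the pressure channel is dropped for this task.
    """
    readings = np.asarray(readings, dtype=float)
    n = readings.shape[0] // window          # number of full windows
    trimmed = readings[: n * window, :6]     # drop pressure, trim remainder
    return trimmed.reshape(n, window, 6)     # (batch, time, channels)

# Example: 200 simulated 7-axis samples -> 4 windows of shape (50, 6)
stream = np.random.default_rng(0).normal(size=(200, 7))
batch = make_windows(stream)
print(batch.shape)  # (4, 50, 6)
```

Each resulting window can then be fed to a 1-D CNN whose output classes (or regressed coordinates) correspond to contact locations on the top layer of the sensor.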