Individual pig tracking is key to moving away from group-level treatment and towards individual pig care. By tracking individuals we can monitor each pig's behavioural changes over time and use these as indicators of health and well-being, which, in turn, will assist in the early detection of disease, allowing for earlier and more effective intervention. However, this task is much more computationally challenging than its group-level counterpart: mistakes in identification and tracking accumulate and, over time, yield noisy behavioural measures. We combine a deep CNN object localisation method, the Faster Region-based Convolutional Neural Network (Faster R-CNN), with two candidate real-time multi-object tracking methods to create a complete system that can autonomously localise and track individual pigs, allowing metrics pertaining to individual pig behaviours to be extracted from RGB cameras. We evaluate two transfer learning strategies for adapting Faster R-CNN to our pig detection dataset, which is more challenging than conventional tracking benchmark datasets. We are able to localise pigs in individual frames with 0.901 mean average precision (mAP), which then allows us to track individual pigs across video footage with 92% Multi-Object Tracking Accuracy (MOTA) and 73.4% Identity F1-Score (IDF1), and to re-identify them after occlusions and dropped frames with 0.862 mAP (0.788 Rank-1 cumulative matching characteristic (CMC)). From these tracks we extract individual behavioural metrics for total distance travelled, time spent idle, and average speed, each with less than 0.015 mean squared error (MSE). Changes in all of these behavioural metrics have value in monitoring pig health and well-being.
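
To make the detect-then-track pipeline concrete, the sketch below chains an off-the-shelf Faster R-CNN detector with a simple greedy IoU tracker and derives the three behavioural metrics from the resulting tracks. It is a minimal illustration under stated assumptions, not the system described above: the pretrained COCO weights stand in for the fine-tuned pig detector, greedy IoU association stands in for the evaluated real-time trackers, and `score_thresh`, `iou_thresh`, `fps`, `px_per_metre`, and `idle_px` are hypothetical calibration parameters.

```python
# Illustrative sketch only: pretrained COCO weights stand in for the
# fine-tuned pig detector, and greedy IoU matching stands in for the
# real-time multi-object trackers evaluated in the paper.
import torch
import torchvision
from torchvision.ops import box_iou

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(frame, score_thresh=0.8):
    """Localise animals in one RGB frame (float tensor (3, H, W) in [0, 1])."""
    with torch.no_grad():
        out = model([frame])[0]
    return out["boxes"][out["scores"] >= score_thresh]

def track(frames, iou_thresh=0.3):
    """Link per-frame detections into tracks by greedy IoU association."""
    tracks = []  # each track: {"boxes": [(frame_idx, box)], "last": box}
    for t, frame in enumerate(frames):
        boxes = detect(frame)
        unmatched = set(range(len(boxes)))
        if tracks and len(boxes):
            ious = box_iou(torch.stack([tr["last"] for tr in tracks]), boxes)
            for ti in range(len(tracks)):
                if not unmatched:
                    break
                best = max(unmatched, key=lambda di: ious[ti, di].item())
                if ious[ti, best] >= iou_thresh:  # extend an existing track
                    tracks[ti]["boxes"].append((t, boxes[best]))
                    tracks[ti]["last"] = boxes[best]
                    unmatched.discard(best)
        for di in unmatched:  # unmatched detections start new tracks
            tracks.append({"boxes": [(t, boxes[di])], "last": boxes[di]})
    return tracks

def behaviour_metrics(track, fps, px_per_metre, idle_px=2.0):
    """Total distance (m), idle time (s), and average speed (m/s) for one
    track; fps, px_per_metre, and idle_px are assumed calibration values."""
    centres = [((b[0] + b[2]) / 2, (b[1] + b[3]) / 2)
               for _, b in track["boxes"]]
    steps = [float(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5)
             for (x1, y1), (x2, y2) in zip(centres, centres[1:])]
    distance = sum(steps) / px_per_metre          # pixel path length -> metres
    idle_time = sum(1 for s in steps if s < idle_px) / fps
    duration = max(len(steps), 1) / fps
    return distance, idle_time, distance / duration
```

In the full system the detector would be fine-tuned on the pig dataset via the transfer learning strategies described above, and the greedy association would be replaced by one of the evaluated real-time multi-object trackers, with re-identification recovering identities after occlusions and dropped frames.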