Animal behavior can be an indicator of animal productivity and well-being, and thus of how animals respond to changes in their biophysical environment. This study monitored the behaviors of sows and piglets in a commercial setting using an autonomous machine vision system. The objectives of this research were to: (1) implement a digital and time-of-flight depth imaging system, (2) develop a process requiring minimal user input to analyze the collected images, and (3) calculate the hourly and daily posture and behavior budgets of sows housed in individual farrowing stalls. Depth sensors were centered above each stall in three farrowing rooms (20 sows per room) and controlled by mini-PCs, acquiring images continuously at 0.2 frames per second. Data files were transmitted via Ethernet cable to a switch, then to a 50 TB disk station for storage. Recorded image data were subsequently analyzed with a computer processing algorithm to quantify sow posture budgets and behaviors. Algorithm classifications were compared to those of trained human labelers; sow posture was classified correctly >99.2% of the time (sitting: 99.4%, standing: 99.2%, kneeling: 99.7%, lying: 99.9%). Specificity and sensitivity for posture classifications were >84.6%, with the exception of lower specificity for kneeling (20.5%). For lying sows, lying direction (left or right side of the body) was classified with an accuracy of 96.2%. Sows that were not lying were also labeled with a behavior: feeding (97.0% accuracy), drinking (96.8% accuracy), or other (95.5% accuracy). Each non-lying behavior label had specificity >88.3% and sensitivity >77.4%. This autonomous system enables acquisition of a large amount of replicated data for evaluating the effects of changing the farrowing environment on sow behavior and, potentially, well-being.
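The abstract does not specify the decision rules used by the processing algorithm, but a minimal sketch can illustrate how posture might be inferred from a single overhead time-of-flight depth frame. The sketch below is illustrative only, not the authors' method: the height thresholds (`LYING_MAX_HEIGHT`, etc.), the floor-subtraction step, and the function `classify_posture` are all hypothetical, chosen to show how per-pixel heights above the stall floor could be mapped to the four posture classes reported in the study.

```python
import numpy as np

# Hypothetical height thresholds (meters above the stall floor);
# the study's actual decision rules and parameters are not given
# in the abstract.
LYING_MAX_HEIGHT = 0.35
SITTING_MAX_HEIGHT = 0.55
KNEELING_MAX_HEIGHT = 0.65


def classify_posture(depth_frame: np.ndarray, floor_depth: float) -> str:
    """Classify sow posture from one overhead depth frame (a sketch).

    depth_frame: 2-D array of per-pixel sensor-to-surface distances (m).
    floor_depth: sensor-to-floor distance for this stall (m).
    """
    # Convert sensor distances to heights above the floor; under an
    # overhead camera the sow appears as the tallest region in the stall.
    height = floor_depth - depth_frame
    sow_pixels = height[height > 0.10]  # discard floor and near-floor noise
    if sow_pixels.size == 0:
        return "no sow detected"

    # A high percentile of the height distribution is more robust to
    # time-of-flight sensor noise than the single maximum pixel.
    top_height = float(np.percentile(sow_pixels, 95))

    if top_height < LYING_MAX_HEIGHT:
        return "lying"
    if top_height < SITTING_MAX_HEIGHT:
        return "sitting"
    if top_height < KNEELING_MAX_HEIGHT:
        return "kneeling"
    return "standing"
```

A threshold rule of this kind is the simplest plausible baseline; distinguishing postures with similar peak heights (e.g., kneeling versus standing) would in practice require richer features of the height distribution, which is consistent with the lower kneeling specificity (20.5%) reported above.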