Attritable, microphone-equipped sensor systems are ubiquitous and can detect a variety of moving targets, from ground vehicles to unmanned aerial systems (UAS), aircraft, and spacecraft. Smartphone platforms have open data-collection interfaces and can use cellular, Wi-Fi, and other public communications to relay data. In addition, acoustic signatures are more difficult to mask than transmissions in radio-frequency bands, making sound a passive complement to RF sensing in an early-warning system. This presentation discusses the development, deployment, and ongoing AI/ML work behind RedVox, a smartphone app available through the Apple App Store and Google Play Store. RedVox records data using the smartphone's onboard sensors, including the accelerometer, gyroscope, magnetometer, barometer, and microphone; these data packets can be streamed to interested parties in real time and analyzed online using RedVox's open-source software tools. One of RedVox's unique features is the ability to collect infrasonic data at frequencies below human hearing, which have been used to detect natural disasters, explosions, and motorized vehicles. The RedVox app is rapidly deployable at scale in regions of interest. Two current focus areas are Guam and Coconut Island off Oahu, where networks of smartphones can perform edge processing and distributed sensing of airborne and maritime targets. The RedVox ecosystem has matured over the past decade across diverse environments and currently provides advanced edge and cloud analytics. Further development could lead to a lightweight machine-learning model that flags anomalous entities for human review, offering a resilient ad hoc sensor network for defense or emergency-response applications.
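
The following is a minimal sketch of the kind of edge processing described above: isolating the infrasonic band from a microphone or barometer time series and flagging windows whose band energy departs from the record's baseline. It is illustrative only and does not use the RedVox SDK's own API; the sample rate, band limits, thresholds, and synthetic test signal are assumptions.

```python
"""Illustrative infrasound band extraction and simple anomaly flagging.

All numerical choices (sample rate, band limits, window length, threshold)
are hypothetical; replace the synthetic data with real sensor samples.
"""
import numpy as np
from scipy.signal import butter, sosfiltfilt


def infrasound_band(samples: np.ndarray, fs: float,
                    low_hz: float = 0.1, high_hz: float = 20.0) -> np.ndarray:
    """Band-pass filter a waveform to the infrasonic range (below ~20 Hz)."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, samples)


def flag_anomalous_windows(samples: np.ndarray, fs: float,
                           window_s: float = 10.0,
                           z_thresh: float = 3.0) -> np.ndarray:
    """Return indices of fixed-length windows whose infrasonic RMS energy
    exceeds z_thresh spreads above the record's median energy."""
    band = infrasound_band(samples, fs)
    n = int(window_s * fs)
    windows = band[: len(band) // n * n].reshape(-1, n)
    rms = np.sqrt(np.mean(windows ** 2, axis=1))
    baseline, spread = np.median(rms), np.std(rms) + 1e-12
    return np.flatnonzero((rms - baseline) / spread > z_thresh)


if __name__ == "__main__":
    # Synthetic ten-minute record: white noise with a 5 Hz burst at t = 300 s.
    fs = 800.0  # hypothetical decimated microphone rate
    t = np.arange(0, 600, 1 / fs)
    samples = np.random.default_rng(0).normal(0.0, 1.0, t.size)
    burst = 5.0 * np.sin(2 * np.pi * 5.0 * t[: int(10 * fs)])
    samples[int(300 * fs):int(310 * fs)] += burst
    print("Flagged windows:", flag_anomalous_windows(samples, fs))
```

A threshold on band-limited energy is only a stand-in for the lightweight machine-learning model mentioned above, but it illustrates the workflow such a model would follow on-device: filter to the band of interest, compute features per window, and forward only flagged windows for human review.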