Lightning is an impressive, widespread natural phenomenon, yet many
questions about its physics remain unanswered due to its extreme speed,
transience, and high energy. These intrinsic characteristics make
capturing its formation, propagation, and discharge with conventional
optical cameras challenging; optical sensors with extremely high frame
rates and high dynamic range are needed. While high-speed cameras have
been used to capture lightning, their lack of portability, high cost,
and large data storage requirements can limit lightning research. To
address these challenges,
the use of neuromorphic technologies, inspired by the sensing and data
processing mechanisms of biological photoreceptors, offers a unique
approach. Event-based vision sensors offer low latency, consume less
power than conventional cameras, and sense across a dynamic range of
over 120 dB. This paper demonstrates the effectiveness of an
Event-Based Vision Sensor in lightning research by presenting data
collected during a full lightning storm and provides examples of how
event-based data can be used to interpret various lightning features. We
used a Prophesee Gen4 Event-Based Vision Sensor to record a thunderstorm
over a fifty-minute span on 24 January 2023, from Western Sydney, New
South Wales, Australia. During this observation, we recorded numerous
Cloud-to-Ground and Cloud-to-Cloud lightning strikes. To assess the
Event-Based Vision Sensor’s effectiveness in capturing commonly observed
lightning features, we used custom algorithms and in-house
post-processing software to analyze and interpret the data. We
conclude that the Event-Based Vision Sensor has the potential to improve
high-speed imagery due to its lower cost, lower data output, and ease of
deployment, ultimately establishing it as an excellent complementary
tool for lightning observation.
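
To illustrate how recordings of this kind can be handled, the following
is a minimal sketch, not the in-house post-processing software used in
this study. It assumes the events have already been decoded into a NumPy
structured array of (x, y, timestamp, polarity) tuples and bins them
into fixed-duration count frames so that individual strokes can be
inspected frame by frame; all names and parameters are illustrative.

```python
import numpy as np

def events_to_frames(events, width, height, bin_us=1000):
    """Accumulate events into signed count frames of bin_us microseconds each."""
    t0 = events["t"].min()
    frame_idx = (events["t"] - t0) // bin_us        # time bin of each event
    n_frames = int(frame_idx.max()) + 1
    frames = np.zeros((n_frames, height, width), dtype=np.int32)
    # +1 for ON (brightness-increase) events, -1 for OFF events
    np.add.at(frames,
              (frame_idx, events["y"], events["x"]),
              np.where(events["p"] > 0, 1, -1))
    return frames

# Synthetic events for illustration only; a real recording would be decoded
# from the sensor output instead (the Gen4 sensor has 1280 x 720 pixels).
rng = np.random.default_rng(0)
ev = np.zeros(10_000, dtype=[("x", "u2"), ("y", "u2"), ("t", "u8"), ("p", "i1")])
ev["x"] = rng.integers(0, 1280, ev.size)
ev["y"] = rng.integers(0, 720, ev.size)
ev["t"] = np.sort(rng.integers(0, 10_000, ev.size))
ev["p"] = rng.integers(0, 2, ev.size)
frames = events_to_frames(ev, width=1280, height=720, bin_us=1000)
print(frames.shape)  # (number of 1 ms bins, 720, 1280)
```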