The High Energy Photon Source (HEPS) is expected to produce a substantial volume of data, leading to immense data I/O pressure during computing. Inefficient data I/O can significantly degrade computing performance.
To address this challenge, we first developed a data I/O framework for HEPS. This framework consists of three layers: the data channel layer, the distributed memory management layer, and the I/O interface layer. It masks the underlying differences in data formats and sources while implementing efficient I/O methods, and it supports both stream computing and batch computing.
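As a rough illustration of how an I/O interface layer can mask format and source differences, the Python sketch below exposes one uniform read call over a disk-based HDF5 backend and an in-memory stream backend. The names (DataChannel, Hdf5FileChannel, MemoryStreamChannel) are our own placeholders, not the actual HEPS framework API.

```python
from abc import ABC, abstractmethod

import h5py
import numpy as np


class DataChannel(ABC):
    """Uniform read interface that hides the data's format and source."""

    @abstractmethod
    def read(self, name: str) -> np.ndarray:
        ...


class Hdf5FileChannel(DataChannel):
    """Batch-style channel backed by an HDF5 file on disk."""

    def __init__(self, path: str):
        self._path = path

    def read(self, name: str) -> np.ndarray:
        with h5py.File(self._path, "r") as f:
            return f[name][...]  # load the named dataset into memory


class MemoryStreamChannel(DataChannel):
    """Stream-style channel backed by frames already held in memory."""

    def __init__(self, frames: dict[str, np.ndarray]):
        self._frames = frames

    def read(self, name: str) -> np.ndarray:
        return self._frames[name]


def load(channel: DataChannel, name: str) -> np.ndarray:
    # Caller code is identical for disk-based batch data and in-memory streams.
    return channel.read(name)
```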
Second, we designed a data processing pipeline scheme aimed at reducing I/O latency and improving I/O bandwidth utilization when processing high-throughput data. The scheme breaks a computing task into several stages, including data loading, data pre-processing, data processing, and data writing, which are executed asynchronously and in parallel.
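A minimal sketch of this kind of staged pipeline is shown below, with one thread per stage and bounded queues between stages so that loading, pre-processing, processing, and writing overlap in time. The stage bodies are illustrative stand-ins, not the actual HEPS kernels.

```python
import queue
import threading

import numpy as np

# Bounded queues decouple the stages; a slow stage applies back-pressure
# instead of letting upstream stages exhaust memory.
raw_q = queue.Queue(maxsize=8)
pre_q = queue.Queue(maxsize=8)
out_q = queue.Queue(maxsize=8)

N_FRAMES = 16
STOP = object()  # sentinel telling the next stage to shut down


def load_stage():
    # Stand-in for reading raw frames through the data channel layer.
    for i in range(N_FRAMES):
        raw_q.put(np.full((512, 512), i, dtype=np.uint16))
    raw_q.put(STOP)


def preprocess_stage():
    # Stand-in for corrections such as dark-field subtraction.
    while (frame := raw_q.get()) is not STOP:
        pre_q.put(frame.astype(np.float32) - 1.0)
    pre_q.put(STOP)


def process_stage():
    # Stand-in for the main analysis or reconstruction step.
    while (frame := pre_q.get()) is not STOP:
        out_q.put(frame.mean())
    out_q.put(STOP)


def write_stage():
    # Stand-in for writing results back to storage.
    while (result := out_q.get()) is not STOP:
        print(f"result: {result:.1f}")


stages = [load_stage, preprocess_stage, process_stage, write_stage]
threads = [threading.Thread(target=s) for s in stages]
for t in threads:
    t.start()
for t in threads:
    t.join()
```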
Finally, we introduce the design of the stream data I/O process. The primary objective of stream data I/O is to enable real-time online processing of raw data, avoiding the I/O bottlenecks caused by disk storage. This approach ensures the stability of data transmission and integrates distributed memory management to guarantee data integrity in memory.
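The sketch below shows the integrity side of such a stream, assuming per-frame sequence numbers and CRC-32 checksums; these are our illustrative choices, not a confirmed HEPS mechanism. Frames are validated on arrival and kept in an in-memory buffer rather than being staged through disk.

```python
import zlib
from dataclasses import dataclass


@dataclass
class Packet:
    seq: int        # monotonically increasing frame sequence number
    payload: bytes
    crc: int        # CRC-32 of the payload, set by the producer


def make_packet(seq: int, payload: bytes) -> Packet:
    return Packet(seq, payload, zlib.crc32(payload))


class StreamReceiver:
    """Holds in-flight frames in memory and checks integrity on arrival."""

    def __init__(self):
        self._expected_seq = 0
        self._buffer: dict[int, bytes] = {}  # in-memory frame store

    def receive(self, pkt: Packet) -> None:
        # Detect corruption in transit.
        if zlib.crc32(pkt.payload) != pkt.crc:
            raise ValueError(f"checksum mismatch for frame {pkt.seq}")
        # Detect loss or reordering.
        if pkt.seq != self._expected_seq:
            raise ValueError(
                f"expected frame {self._expected_seq}, got {pkt.seq}"
            )
        self._buffer[pkt.seq] = pkt.payload
        self._expected_seq += 1

    def frames_held(self) -> int:
        return len(self._buffer)


recv = StreamReceiver()
for i in range(3):
    recv.receive(make_packet(i, f"frame-{i}".encode()))
print(f"{recv.frames_held()} frames held in memory")
```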