Description
The High Energy Photon Source (HEPS) is expected to produce a substantial volume of raw data, posing significant computational challenges for scientific research. To address this problem, we have developed a high-throughput data I/O framework tailored for HEPS and aimed at mitigating I/O bottlenecks. First, the framework provides a unified I/O interface for computational tasks, shielding them from differences in the underlying data sources and formats. Second, an asynchronous prefetch method is integrated into the framework to accelerate data reads and writes; the prefetch volume is adjusted dynamically according to the computational task and the available memory, optimizing the use of compute-node memory. Finally, to overcome the slow data access caused by writing data to disk and then reading it back, the framework is extended with a streaming data module. This module dynamically parses data streams from the DAQ and stores them in a distributed cache pool, further accelerating data retrieval by consuming the data streams directly.
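To make the unified-interface and prefetch ideas above concrete, the following Python sketch shows one possible shape of a source-agnostic reader combined with an asynchronous prefetch thread whose queue depth is derived from a memory budget. All names here (DataSource, PrefetchingReader, read_block, etc.) are hypothetical illustrations under assumed block sizes and budgets, not the actual HEPS framework API.

```python
# Hypothetical sketch: unified read interface plus asynchronous, memory-bounded prefetch.
# Class and method names are illustrative and do not reflect the real HEPS framework.
import queue
import threading
from abc import ABC, abstractmethod


class DataSource(ABC):
    """Uniform interface hiding the underlying source (e.g. HDF5 file, DAQ stream, cache pool)."""

    @abstractmethod
    def num_blocks(self) -> int:
        ...

    @abstractmethod
    def read_block(self, index: int) -> bytes:
        ...


class PrefetchingReader:
    """Reads blocks ahead of the consumer on a background thread.

    The prefetch depth is derived from an assumed per-block size and memory
    budget, standing in for the dynamic adjustment described in the abstract.
    """

    def __init__(self, source: DataSource, block_bytes: int, memory_budget_bytes: int):
        depth = max(1, memory_budget_bytes // block_bytes)  # bounded prefetch depth
        self._source = source
        self._queue: queue.Queue = queue.Queue(maxsize=depth)
        self._thread = threading.Thread(target=self._fill, daemon=True)
        self._thread.start()

    def _fill(self) -> None:
        # Producer side: blocks when the queue (i.e. the memory budget) is full.
        for i in range(self._source.num_blocks()):
            self._queue.put(self._source.read_block(i))
        self._queue.put(None)  # sentinel: no more data

    def __iter__(self):
        # Consumer side: yields blocks as the compute task requests them.
        while (block := self._queue.get()) is not None:
            yield block
```

A computational task would iterate over a PrefetchingReader without knowing whether the blocks originate from files or from the streaming cache pool, which is the decoupling the unified interface is meant to provide.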