Workflow Engines

Conveners:
- Paolo Mutti (Institut Laue-Langevin)
The increasing complexity and speed of experiments at synchrotrons call for efficient, well-established automated data processing pipelines. Workflows are the ideal approach for defining these pipelines, given their ability to describe data processing recipes. This has led to the emergence of many workflow systems, such as tomwer at tomography beamlines and pypushflow at MX...
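As a rough illustration of the "recipe" idea, the sketch below chains a few toy processing tasks into a linear pipeline. It is a generic Python sketch, not the actual tomwer or pypushflow API; the `Workflow` class and all task names are hypothetical.

```python
# Minimal sketch of a workflow as a chain of processing tasks.
# Generic illustration only; not the tomwer or pypushflow API.
from typing import Any, Callable

class Workflow:
    """A linear pipeline: each task consumes the previous task's output."""
    def __init__(self) -> None:
        self.tasks: list[Callable[[Any], Any]] = []

    def add_task(self, task: Callable[[Any], Any]) -> "Workflow":
        self.tasks.append(task)
        return self

    def run(self, data: Any) -> Any:
        for task in self.tasks:
            data = task(data)
        return data

# Hypothetical recipe loosely modelled on a tomography-style pipeline.
pipeline = (
    Workflow()
    .add_task(lambda frames: [f - 1 for f in frames])  # dark-field correction (toy)
    .add_task(lambda frames: [f * 2 for f in frames])  # normalisation (toy)
    .add_task(sum)                                     # reconstruction stand-in
)
print(pipeline.run([3, 4, 5]))  # -> 18
```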
Data are essential to the scientific discoveries enabled by experiments performed at the APS. When the facility resumes operation this year, it will generate an estimated 100 PB of raw experimental data per year from its seventy-two operating beamlines, which house over 100 unique instruments. These data are generated as part of over 6,000 annual experiments performed by over 5,500 facility users...
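For a sense of scale, the figures quoted above imply the following averages (a back-of-envelope calculation from the abstract's own numbers, not an official APS estimate; real per-experiment volumes vary enormously by technique):

```python
# Rough averages derived from the quoted figures only.
raw_data_pb_per_year = 100
experiments_per_year = 6_000
beamlines = 72

tb_per_experiment = raw_data_pb_per_year * 1_000 / experiments_per_year
pb_per_beamline = raw_data_pb_per_year / beamlines

print(f"~{tb_per_experiment:.1f} TB per experiment on average")  # ~16.7 TB
print(f"~{pb_per_beamline:.2f} PB per beamline per year")        # ~1.39 PB
```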
Building resilient data streams for large-scale experiments is a critical problem as advanced computing techniques become ever more tightly integrated with data collection activities. Resilience-aware application solutions must include 1) policy management, in which science-level goals are presented to the system; 2) data movement telemetry, which captures system responses;...
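As a rough illustration of how the first two ingredients might fit together, the sketch below pairs a science-level policy with telemetry captured from retried transfers. All names here (`TransferPolicy`, `Telemetry`, `send_with_retry`) are hypothetical stand-ins, not part of any system described in the abstract.

```python
# Sketch: a policy expressing science-level goals, plus telemetry
# recording how the system responded to each transfer attempt.
import random
import time
from dataclasses import dataclass, field

@dataclass
class TransferPolicy:
    max_retries: int = 3             # hypothetical policy knobs
    backoff_s: float = 0.5           # delay between attempts
    min_throughput_mbps: float = 10  # goal used when evaluating telemetry

@dataclass
class Telemetry:
    attempts: int = 0
    failures: int = 0
    history: list = field(default_factory=list)

def send_with_retry(payload: bytes, policy: TransferPolicy, tel: Telemetry) -> bool:
    """Try to move one chunk, capturing telemetry for each attempt."""
    for attempt in range(1, policy.max_retries + 1):
        tel.attempts += 1
        ok = random.random() > 0.3   # stand-in for a real transfer call
        tel.history.append((attempt, ok))
        if ok:
            return True
        tel.failures += 1
        time.sleep(policy.backoff_s)  # back off before retrying
    return False

telemetry = Telemetry()
send_with_retry(b"detector-frame", TransferPolicy(), telemetry)
print(telemetry)
```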
With [FAIR principles][1] increasing in importance across many fields, the challenge of ensuring that these principles are fully embedded in research outputs applies not just to the (meta)data itself, but also to the methods used to process and analyse it. If the metadata associated with the raw data is lost during analysis, Findability (which relies on this metadata) may be...
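One common way to avoid this loss is to have every analysis step copy the metadata from its input to its output while appending a record of what it did. The sketch below is a minimal, library-free illustration of that idea under those assumptions; the `Annotated` container and `apply_step` helper are hypothetical, and a real pipeline would more likely store this in NeXus/HDF5 attributes.

```python
# Sketch: propagate metadata through an analysis step so that
# Findability-critical fields survive alongside the derived data.
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Annotated:
    """Data bundled with the metadata that Findability depends on."""
    data: Any
    metadata: dict = field(default_factory=dict)

def apply_step(inp: Annotated, name: str, fn: Callable[[Any], Any]) -> Annotated:
    """Run one analysis step, carrying metadata forward and recording provenance."""
    out_meta = dict(inp.metadata)  # copy, so the input record is untouched
    out_meta["history"] = list(inp.metadata.get("history", [])) + [name]
    return Annotated(fn(inp.data), out_meta)

raw = Annotated([1.0, 2.0, 4.0], {"sample_id": "S-042", "beamline": "BM01"})
norm = apply_step(raw, "normalise", lambda xs: [x / max(xs) for x in xs])
print(norm.metadata)  # sample metadata survives; history records "normalise"
```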