https://indico.esrf.fr/event/137/
Please follow the link for a detailed description and the program for the meeting!
The Ada Lovelace Centre (ALC) was set up to maximise the utilisation and impact of data from the STFC large facilities: Diamond Light Source, the ISIS Neutron and Muon Source, and the Central Laser Facility.
Our activities cover data management, cloud-based services, materials modelling, computational biology, imaging, and the application of maths and AI to these areas and to science and...
Due to the diverse data acquisition modes and complex online analysis methods at the various beamlines of synchrotron radiation light sources, beamline users often need to become acquainted with the interface, functionality and workflow of the data acquisition software before an experiment starts. This process relies heavily on on-site guidance from the beamline staff...
From the Bluesky project, Tiled [1] is a data service that removes several barriers by providing secure, authenticated remote access to data. Tiled abstracts variations in file formats and other data storage details across different beamline instruments, making data analysis and visualization code portable. It enables fast, targeted access to specific data regions and offers search and...
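The portability that Tiled provides comes from hiding storage details behind a uniform access interface. The following is a minimal sketch of that idea only; the class and method names are hypothetical illustrations, not Tiled's actual API:

```python
import numpy as np

class CSVSource:
    """Hypothetical source storing a 1-D array as comma-separated text."""
    def __init__(self, text):
        self._text = text
    def read(self):
        return np.array([float(v) for v in self._text.split(",")])

class BinarySource:
    """Hypothetical source storing the same logical data as raw bytes."""
    def __init__(self, raw):
        self._raw = raw
    def read(self):
        return np.frombuffer(self._raw, dtype=np.float64)

def analyse(source):
    # Analysis code sees only the common read() interface,
    # so it is portable across storage formats.
    return float(source.read().sum())

csv = CSVSource("1.0,2.0,3.0")
binary = BinarySource(np.array([1.0, 2.0, 3.0]).tobytes())
```

The same `analyse` function works on both sources; swapping the file format underneath does not touch the analysis code.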
Synchrotron light source facilities are evolving into the fourth generation, with extreme spatial, temporal and energy resolving capabilities, driving a transition of experimental modes towards high-resolution, multiscale, ultra-fast and in-situ characterization with dynamic loading or under operando conditions. This transition raises challenges in balancing acquisition efficiency and data...
An entire scientific workflow, from acquisition through analysis, has been automated to optimize the success rate of measurements at the APS. The optimizations leverage a custom Python software stack integrating Bluesky (with connections to EPICS), a large language model Scientific Companion, and the APS Data Management tools.
The integration simplifies the learning needed...
In this presentation, we review several approaches to crashproofing the HDF5 library. We describe an implementation based on a Write-Ahead Log (WAL) for metadata within the HDF5 library. In the event of a crash during the lifetime of an application using HDF5, this WAL can be used by the library or an external tool to restore the metadata within an HDF5 file to a self-consistent state,...
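The core write-ahead-log idea can be sketched in a few lines: record the intended metadata change durably before applying it, so that replaying the log after a crash restores a self-consistent state. This toy model uses a Python dict and list as stand-ins for the metadata store and the on-disk log; it illustrates the principle only, not the HDF5 library's actual implementation:

```python
import json

class MetadataStore:
    """Toy in-memory metadata store protected by a write-ahead log."""
    def __init__(self):
        self.data = {}
        self.wal = []          # stands in for an on-disk log file

    def update(self, key, value):
        # 1. Append the intent to the log *before* touching the store.
        self.wal.append(json.dumps({"key": key, "value": value}))
        # 2. Apply the change. A crash between steps 1 and 2 leaves
        #    the log ahead of the store, which replay can fix.
        self.data[key] = value

    def replay(self):
        # After a crash, re-apply every logged entry; the updates are
        # idempotent, so replaying already-applied ones is harmless.
        for entry in self.wal:
            rec = json.loads(entry)
            self.data[rec["key"]] = rec["value"]

store = MetadataStore()
store.update("free_space", 4096)
# Simulate a crash that logged an update but never applied it:
store.wal.append(json.dumps({"key": "root_addr", "value": 512}))
store.replay()   # the store is now consistent with the log again
```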
HDF5 (with NeXus) is becoming the standard in many X-ray facilities. HDF5 viewers are needed to allow users to browse and inspect the hierarchical structure of HDF5 files, as well as visualize the datasets inside as basic plots (1D, 2D). Having such a viewer on the web is especially interesting since it allows users to browse files remotely without having to...
The Advanced Photon Source (APS) at Argonne National Laboratory (ANL) is in the midst of an upgrade project that includes the replacement of the entire storage ring with a ring based on a multi-bend achromat lattice design. This new storage ring will increase the APS’s brightness by factors of up to 500, depending on x-ray energy, and make the APS the brightest hard x-ray synchrotron source in the...
The Swiss Light Source (SLS) is undergoing a significant hardware upgrade with the SLS 2.0 program, presenting an opportunity to address software challenges, particularly in beamline and experiment control systems. With official endorsement for deployment at the SLS, the new Beamline and Experiment Control system (BEC) provides a unified solution for beamlines, overcoming past challenges of...
The VISA platform has been proposed as a common, locally deployed portal for data analysis services for the photon and neutron community. It seeks to abstract away site- and experiment-specific configuration with ready-to-use virtual data analysis environments, and it keeps track of interactive sessions.
At the European XFEL, we have gained first experience with using the VISA platform for schools and...
MicroTomo2 is an X-ray imaging station installed at the STAR accelerator facility of the University of Calabria. The STAR accelerator, currently under construction, will produce photons with energies up to 350 keV generated by the laser light-electron collisions using the Thomson back-scattering phenomenon. The MicroTomo2 experimental station will provide full-field X-ray radiography and...
The ESRF Extremely Brilliant Source (EBS), Europe's first high-energy 4th generation synchrotron, started user operation in August 2020. To benefit from the exceptionally high-quality X-rays produced by the ESRF-EBS, all the experimental control and data analysis pipelines have been significantly upgraded on all the EMBL-ESRF Joint Structural Biology Group (JSBG) beamlines at the ESRF. On the...
Neutron tomography is a crucial tool for material examination, but ring artifacts can significantly decrease data quality and complicate tasks like segmentation and morphological analysis. The Block-Matching and 3D filtering (BM3D) algorithm, known for mitigating vertical streaks in sinograms and addressing the root cause of ring artifacts, is unfortunately slow and CPU-intensive. We introduce...
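For context, ring artifacts correspond to near-constant vertical stripes in the sinogram, which is why sinogram-domain filtering such as BM3D addresses their root cause. A deliberately simple baseline (not BM3D) that illustrates the principle is to estimate each detector column's offset against a smoothed reference and subtract it:

```python
import numpy as np

def remove_stripes_mean(sino):
    """Suppress vertical stripes by removing each column's deviation
    from a smoothed per-column mean. A crude baseline for illustration
    only -- BM3D is far more sophisticated."""
    col_mean = sino.mean(axis=0)            # mean over projection angles
    kernel = np.ones(5) / 5.0               # smooth across detector pixels
    smooth = np.convolve(col_mean, kernel, mode="same")
    return sino - (col_mean - smooth)       # broadcast over all rows

# Synthetic sinogram with one miscalibrated detector column:
sino = np.ones((180, 64))
sino[:, 30] += 0.5                          # constant stripe -> ring artifact
clean = remove_stripes_mean(sino)
```

After correction the stripe column matches its neighbours, which in the reconstruction corresponds to the ring disappearing.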
As pre-trained large models and their associated fine-tuning paradigms are applied ever more widely in deep learning, model performance has improved dramatically, owing mostly to gains in both data quantity and quality. Next-generation synchrotron light sources offer ultra-bright and highly coherent X-rays and are becoming one of the largest data sources for...
In order to provide higher-quality X-ray beams, synchrotron radiation light sources are becoming increasingly complex, comprising dozens of highly delicate optical devices and sensitive experimental setups. This creates a common problem for all users: the need to quickly and accurately adjust device attitudes and optimize the beamline to achieve the optimal photon beam and experimental...
In 2023, a Google team introduced LION (EvoLved Sign Momentum), a new optimizer for deep neural network training discovered through thousands of hours of training on a TPU cluster. It is more memory-efficient than Adam, as it only keeps track of the momentum and drops the epsilon and one group of momentum parameters. We applied the LION solver to the physics-informed neural network PtychoPINN...
Operando catalysis experiments involve many different devices and processes running simultaneously and on different timescales. Data from the sample environment, which changes over time, has to be correlated with data from measurement techniques acquired on X-ray beamlines. Users performing these experiments commonly face the daunting task of collating all their data from different...
The SOLEIL II project is set to commence in 2025. This project aims to develop an ambitious Diffraction Limited Storage Ring (DLSR) that will enhance brilliance, coherence, and flux. Additionally, the upgrade will include improvements to the experimental techniques on the beamlines. Automation has been prioritized to address evolving requirements and simplify user experiences at the beamlines...
Stable and continuous operation of large-scale distributed control systems is based on well-established configuration management and data logging. During service interruptions such as power cuts, hardware failures, network outages, planned maintenance and software deployment, parts of the control system may lose crucial configuration, and restoring them to working condition may take...
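Such configuration management rests on a simple snapshot-and-restore pattern. A minimal sketch using JSON and an atomic file replace follows; the setting names are hypothetical examples, not any facility's actual parameters:

```python
import json
import os
import tempfile

def snapshot(settings, path):
    """Persist current device settings so they survive a service
    interruption. Atomic write: dump to a temp file, then rename,
    so a crash mid-write never leaves a half-written snapshot."""
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(settings, f)
    os.replace(tmp, path)          # atomic on POSIX filesystems

def restore(path):
    """Reload the last good snapshot after an outage."""
    with open(path) as f:
        return json.load(f)

settings = {"undulator_gap_mm": 11.5, "shutter": "open"}  # hypothetical
path = os.path.join(tempfile.mkdtemp(), "snapshot.json")
snapshot(settings, path)
recovered = restore(path)
```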
BEC Widgets is an innovative Qt-based GUI framework specially designed to provide graphical user interfaces for the Beamline Experiment Control (BEC), tailored for users at the Swiss Light Source at Paul Scherrer Institute. BEC Widgets ensures seamless integration and a plug-and-play experience that significantly improves workflow efficiency and interactivity for beamline scientists.
With...
The Swiss Light Source at the Paul Scherrer Institut is undergoing an upgrade to a 4th generation synchrotron, presenting an opportunity to enhance its current software stack. To consolidate efforts, a software package for Beamline and Experiment Control (BEC) has been developed, primarily written in Python and leveraging established software tools. For the underlying hardware abstraction...
The easy integration of sample environment hardware into Bluesky represents a crucial step for achieving the goals for automation of in-situ and operando experiments at photon sources required by the ROCK-IT project. The use of Ophyd as a common hardware abstraction layer facilitates the integration of sample environment hardware into an existing Bluesky environment. Based on the metadata and...
Computed tomographic microscopy is one of the milestone techniques of X-ray science. The I- and S-TOMCAT beamlines of the Swiss Light Source (SLS) specialize in in-vivo, in-situ and operando high-speed tomography with over 10 kHz sustained frame rates. The improved flux and brilliance of the new TOMCAT beamlines present new frontiers for dynamic applications. Hence, they will serve as a...
Contemporary web applications are exceptionally efficient and often use a complex stack of frameworks and libraries for front-end development. This escalation in technical complexity requires a team with diverse, specialized skills, inevitably increasing the cost of web development. DonkiWeb is a simple, control-system-oriented web SCADA that follows the KISS (Keep It Simple, Stupid)...
In recent years, the ILL has made a committed effort to implement data reduction for a large share of its instruments within the Mantid framework.
In doing so, the ILL provides its users with access to Mantid's common Graphical User Interface (GUI), Mantid Workbench, bringing the data reduction effort to a wide audience.
Taking advantage of their proximity to users, Mantid ILL developers...
POWTEX is a high-intensity TOF diffractometer at the FRM-II research reactor in Garching bei München, Germany. The instrument will serve the needs of the solid-state chemistry, geoscience, and materials science communities through neutron scattering measurements on POWder and TEXture samples.
An important part of the data processing workflow at POWTEX is data reduction, i.e. correction...
At large synchrotron radiation facility SPring-8, accommodating diverse experimental user demands requires various operation modes and advanced task management for measurement proxies. Despite these complexities, the need for efficient operation through automation has become increasingly essential. This presentation discusses initiatives to enhance operational efficiency at SPring-8, which...
Graphical User Interfaces (GUIs) play a crucial role in defining the user-friendliness of software applications, enhancing efficiency by structuring and organizing the information presented. The ROCK-IT project aims to develop all necessary tools for the automation and remote access of synchrotron-based in-situ and operando experiments, using operando catalysis experiments as a pilot case....
FDA (Fast Data Analyser) is an application developed at ALBA Synchrotron to analyse the data produced with X-ray Absorption Spectroscopy (XAS), as well as with X-Ray Diffraction (XRD). It provides a quick and convenient way of loading, processing, and analysing the data with different methods, such as XANES (X-ray absorption near edge structure) normalization, EXAFS (Extended X-ray absorption...
On-the-fly 3D data reconstruction is a challenging need in synchrotron micro-tomography facilities. This presentation will have two parts. In the first part we will show the data workflow and infrastructure implemented at the FaXToR beamline of the ALBA Spanish synchrotron to follow dynamics inside the samples. In the second, we will present a new approach to implement tomography processing...
The Elettra Synchrotron, located in Italy near Trieste, has been operating for users since 1994 and was the first third-generation light source for soft X-rays in Europe. To stay competitive in world-class photon science, a massive upgrade of the storage ring is planned for 2025. The goal is to build an ultra-low-emittance light source with ultra-high brilliance in the same building as the...
Helmholtz-Zentrum Hereon operates multiple X-ray diffraction (XRD) experiments for external users and while the experiments are very similar, their analysis is not. Pydidas [1] is a software package developed for the batch analysis of X-ray diffraction data. It is published as open source and intended to be widely reusable. Integration is based on the ESRF’s pyFAI package.
Because the wide...
The ESRF EBS upgrade has meant a tremendous increase in X-ray photon flux in the experimental beamlines (BLs), requiring faster and more advanced DAQ techniques. Faster and larger 2D detectors are being developed and need to be integrated into the BL control system. The BLISS control software, designed to push the BL instrumentation to its limits, fulfills the opportunities offered by this...
Providing a remote experimental environment for facility users and utilizing advanced computational resources of cloud environments for instrument control are challenging issues for scientific user facilities such as J-PARC MLF. Therefore, we have developed a hybrid cloud-based instrument control system by modifying IROHA2, which is the standard instrument control software framework for...
The ICAT server is a metadata catalogue to support large facility experimental data. python-icat is a Python client library for ICAT. The package provides a collection of modules for writing programs that access an ICAT service. The most important features include the dynamic generation of Python classes to represent the entity object types from the ICAT schema that is automatically adapted...
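The dynamic generation of entity classes from a server-reported schema can be illustrated with Python's built-in `type()`. This is a simplified sketch of the concept only, not python-icat's actual implementation; the schema content here is hypothetical:

```python
# Hypothetical, simplified schema as a client might fetch it from a server:
schema = {
    "Investigation": ["name", "title"],
    "Dataset": ["name", "investigation"],
}

def make_entity_class(name, fields):
    """Generate a Python class for one entity type at runtime, so the
    client adapts to whatever schema the server reports."""
    def __init__(self, **kwargs):
        for field in fields:
            setattr(self, field, kwargs.get(field))
    return type(name, (object,), {"__init__": __init__, "fields": fields})

classes = {n: make_entity_class(n, f) for n, f in schema.items()}
inv = classes["Investigation"](name="proposal-42", title="Test")
```

A schema change on the server side then requires no change to the client code, since the classes are rebuilt from whatever the server reports.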
The Experiment Control System Reliability project aimed at enhancing the reliability of beamline operation at MAX IV. The project focused on the scanning software and the support process, identifying improvements to reduce downtime and increase system reliability. It was structured in an innovative organisation to learn from one stable beamline and, in parallel, to improve the reliability...
The FMX (Frontier Microfocusing Macromolecular Crystallography) beamline at the NSLS-II light source has been developing a new experimental station for fixed target time-resolved serial crystallography on biological systems. We present here the controls-system for a chip scanner to enable the rapid collection of large numbers of room temperature crystallographic measurements on biological...
The Karabo control system is used facility-wide at European XFEL (EuXFEL) to steer experiments and collect scientific data. As a user-centered facility, EuXFEL deals with ever-changing requirements and often faces the need to integrate new instrumentation, or even to cope with user-provided hardware on relatively short notice.
Tango is a well-established control system, and many hardware...
Mantid Imaging is a user-friendly, interactive, open-source and free-to-download GUI application for Linux and Windows. Mantid Imaging is used by scientists and visiting users of IMAT, the ISIS Neutron and Muon Source imaging instrument, for data reduction, reconstruction, and live viewing of 2D and 3D data. The software application is designed to be intuitive, such that users of varying technical ability...
METABOLATOR is a web application for automated analysis of microcalorimetric metabolic data using Monod's equation. The software was developed in collaboration between the Institute of Resource Ecology and the Department of Information Services and Computing at Helmholtz-Zentrum Dresden - Rossendorf (HZDR), and is now offered as a web service for the community. In addition to publishing the...
Karabo is a device-based distributed control system used to implement the control and data acquisition systems of the 3 tunnels and 7 instruments of European XFEL.
Motion systems, both PLC and non-PLC-based, are controlled through single and multi-axis Karabo devices with standardised interfaces and behaviour.
Combined motion of multiple motors is provided through multi-axis devices,...
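The combined-motion idea can be sketched with a toy "gap" axis built from two jaw motors. The class names here are hypothetical stand-ins for Karabo's multi-axis devices, shown only to illustrate the coordination pattern:

```python
class Motor:
    """Toy single-axis motor (hypothetical stand-in for a motor device)."""
    def __init__(self, position=0.0):
        self.position = position
    def move(self, target):
        self.position = target

class GapAxis:
    """Toy multi-axis device: a slit 'gap' defined by two jaw motors.
    Moving the gap moves both underlying motors symmetrically about
    their common centre, so the centre stays fixed."""
    def __init__(self, upper, lower):
        self.upper, self.lower = upper, lower
    @property
    def gap(self):
        return self.upper.position - self.lower.position
    def move_gap(self, gap):
        centre = (self.upper.position + self.lower.position) / 2.0
        self.upper.move(centre + gap / 2.0)
        self.lower.move(centre - gap / 2.0)

up, low = Motor(1.0), Motor(-1.0)
slit = GapAxis(up, low)
slit.move_gap(4.0)   # both jaws move; the centre (0.0) is preserved
```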
The Balder beamline is located at the 3 GeV storage ring of the MAX IV Laboratory (a 4th generation synchrotron) and is dedicated to X-ray absorption and emission spectroscopy in the energy range of 2.4–40 keV [1]. The beamline can deliver a very high photon flux of 10¹³ ph/s and is suitable for experiments under in situ / operando conditions. This kind of experiment requires support for fast...
The research data management group at Helmholtz-Zentrum Berlin is applying FAIR data management. Data is beginning to be moved from specific file formats into NeXus/HDF5 files. The standardization program involves the conversion of already-generated data and the automated creation of NeXus files from new experiments (for example, with Bluesky). Our tool, NexusCreator, makes it possible to separate the...
The information technology (IT) requirements of complex data analysis have been growing steadily over the last decades. Among the techniques routinely performed at synchrotrons, computed tomography (CT) is one of the most IT-resource-demanding. This holds both for the computing (CPU and GPU) and storage (I/O) requirements. Taking into account the faster and larger detectors (exceeding 5...
Recent advances in synchrotron x-ray instrumentation have enabled the rapid acquisition of x-ray diffraction data from single crystals, allowing large contiguous volumes of scattering in reciprocal space to be collected in a matter of minutes, with data rates of several terabytes per day. NXRefine implements a complete workflow for both data acquisition and reduction of single crystal x-ray...
The SciCat[1] metadata catalog is in use at several scientific user facilities. SciCat stores metadata about datasets (both raw and derived), proposals, and instruments. When introduced into SciCat, each dataset is given a unique identifier. Datasets can be searched for and browsed in a web portal. Authorization rules can be applied to allow fine-grained access to datasets for staff and users....
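The ingestion-and-search workflow described above can be modelled in a few lines, with a dict standing in for the catalogue backend. SciCat itself exposes a REST API; the function names below are illustrative only:

```python
import uuid

def register_dataset(catalog, metadata):
    """Minimal sketch of catalog ingestion: each dataset receives a
    unique identifier on entry and is stored with its metadata."""
    pid = str(uuid.uuid4())
    catalog[pid] = metadata
    return pid

def search(catalog, **criteria):
    """Return identifiers of all datasets whose metadata match every
    given criterion -- the essence of the web portal's search."""
    return [pid for pid, md in catalog.items()
            if all(md.get(k) == v for k, v in criteria.items())]

catalog = {}
pid = register_dataset(catalog, {"type": "raw", "instrument": "TOMCAT"})
hits = search(catalog, instrument="TOMCAT")
```

Authorization rules would then act as an extra filter on which records a given user's searches may return.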
The Open Electron Microscopy Data Network (OpenEM) is a Swiss-wide collaboration to improve data management at electron microscopy (EM) facilities and make the dissemination of EM data open and FAIR (findable, accessible, interoperable, and reusable). The collaboration is based around a central SciCat instance hosted at the Paul Scherrer Institute (discovery.psi.ch), which stores metadata from...
PiXiu is a program bridging the gap between first-principles density functional theory (DFT) and inelastic neutron scattering (INS) dynamical structure factor calculations. In addition to performing powder-averaging for powder samples, PiXiu is capable of calculating the dynamical structure factor in four dimensions for single crystals. Under the hood, PiXiu combines the Quantum Espresso and...
In Macromolecular Crystallography (MX) experiments at Diamond Light Source (DLS), crystals are located by scanning samples around a rectangular grid, under synchrotron light, and looking for diffraction. The speed of these scans has been limited by the position capture unit that coordinates trigger signals, the Zebra. Using a Zebra, these scans have historically had a step-like motion,...
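A continuous raster ("snake") trajectory, in which alternate rows are traversed in opposite directions so the stage never steps back to a row start, can be sketched as follows. This computes positions only; real trigger generation is done by the position capture hardware:

```python
import numpy as np

def snake_positions(x0, dx, nx, y0, dy, ny):
    """(x, y) visit order for a continuous snake raster scan over an
    nx-by-ny rectangular grid: even rows left-to-right, odd rows
    right-to-left, avoiding the fly-back of step-like scanning."""
    xs = x0 + dx * np.arange(nx)
    rows = []
    for j in range(ny):
        row_x = xs if j % 2 == 0 else xs[::-1]
        rows.append(np.column_stack([row_x, np.full(nx, y0 + dy * j)]))
    return np.vstack(rows)

pts = snake_positions(0.0, 1.0, 3, 0.0, 1.0, 2)
# row 0 runs left-to-right, row 1 right-to-left
```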
In the past, data collection software was often developed by individuals in non-software-focused roles who placed a low priority on the User Interface (UI), considering it time-consuming and prone to software bugs. However, for software developers primarily tasked with delivering reliable software, the aim is to create a robust platform enabling users of all levels to efficiently collect...
Cross-institutional data sharing is still a challenging problem for the large datasets collected at the Advanced Photon Source (APS). Sector 6 at the APS routinely collects single-crystal x-ray diffraction data at a rate of several terabytes per day, which is streamed for automated data reduction into local file stores. Such large data volumes make it challenging to collaborate on data analysis...
ROCK-IT is a collaboration which aims to demonstrate the ability to perform complex operando catalysis experiments in a highly automated way, enabling remote operation. The project finds common solutions between different facilities which have various control systems and infrastructure. Ophyd provides a common abstraction layer to Tango, EPICS and SECoP. In the demonstrators of this project,...
Insights into "catalysts at work" are of high interest to academic and industrial users, prompting the ROCK-IT project partners DESY, HZB, HZDR, and KIT to enhance capabilities for in situ and operando experiments. ROCK-IT aims to meet the need for a holistic workflow through common remote access protocols, FAIR-data management standards, automation, robotics, experiment and beamline control...
A collaboration between Diamond Light Source and the University of Oxford funds a PhD project aimed at understanding the technical, social and policy implications of adopting the FAIR (Findable, Accessible, Interoperable, Reusable) Principles, and evaluating the effects of their implementation on synchrotron data. The work focuses on the early stage of the science life cycle, when the scientists...
Historically, data workflows for single-crystal inelastic neutron experiments used a pre-histogramming step, storing the data for each detector as a histogram in energy transfer. Each bin of these histograms is then a measurement of the dynamic structure factor (or double differential cross section). A useful representation of this data includes transformation into momentum...
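The pre-histogramming step described above can be illustrated with made-up event energies: each event's energy transfer is computed and immediately binned, after which the per-event information needed for any later re-binning is gone:

```python
import numpy as np

# Hypothetical per-neutron events: a fixed incident energy and the
# measured final energies (meV). Values are illustrative only.
Ei = 25.0
Ef = np.array([25.0, 24.0, 22.0, 25.0, 20.0, 24.0])
omega = Ei - Ef                        # energy transfer per event

# Pre-histogramming: collapse events into fixed energy-transfer bins.
edges = np.arange(-0.5, 6.0, 1.0)      # 1 meV bins centred on integers
counts, _ = np.histogram(omega, bins=edges)
# Each bin is then one measurement of the dynamic structure factor;
# the individual events can no longer be re-binned differently.
```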
High Energy Photon Source (HEPS) is currently under construction. Prior to the installation of all equipment, tasks such as beam tuning can only be carried out based on the experience of the beamline staff and some mathematical estimations, in an imaginary space. While effective, this approach is not convenient. Here, taking the Low-dimension Structure Probe beamline of HEPS as a prototype, we...
The High Energy Photon Source (HEPS), currently under construction, represents an advanced experimental platform facilitating breakthroughs in fundamental scientific research. With over fourteen experimental beamlines, HEPS offers a rich array of research domains and employs complex analytical methodologies. Consequently, it faces formidable data processing challenges, including managing...
The High Energy Photon Source (HEPS) encompasses a variety of experimental types, including diffraction, scattering, imaging, and spectroscopy. The data generated from these experiments are highly dimensional, uncertain, and computationally complex. Considering the users' needs for interoperable data analysis and high-performance I/O processing, it is necessary to organize and manage the data...
Continuous Integration/Continuous Delivery (CI/CD) can facilitate the development and integration process of advanced photon source software and algorithms. A significant amount of repetitive tasks, such as compiling, testing, deploying, and releasing, may impede the progress of algorithm and software development. Developers often need to expend considerable effort maintaining servers and...
SIRIUS is a 4th generation synchrotron light source facility that was designed, built and is operated by the Brazilian Synchrotron Light Laboratory (LNLS/CNPEM). Currently, SIRIUS has 6 fully operational beamlines and another 8 beamlines in technical commissioning, scientific commissioning or installation phases. Most SIRIUS beamlines currently have their experiment control solutions based on...
High-level control systems and Graphical User Interfaces (GUIs) play essential roles in enabling users to interact with complex systems, particularly in beamline environments where precise control and real-time monitoring are crucial. Beamlines must provide tools that facilitate this interaction. On the Quati beamline¹, which is the XAS beamline of Sirius, the experiment control system is...
Synchrotron radiation (SR) light sources provide precise and deep insights that have been driving cutting-edge scientific research. Facing the challenge of SR scientific big data, it is urgent to develop artificial intelligence (AI) analysis methods to enhance research efficiency, including novel material discovery [1]. In this talk, I will focus on the construction of the “Intelligence Terminal”...
For more than 20 years, the Crystallography Open Database (COD) has collected published crystal structure data and made it available on the Web under the CC0 license in an organised, machine-readable and searchable form. Currently, the COD collection holds over 500 thousand records and is used for crystal analysis, material identification, DFT calculations, machine learning, teaching and much...
At Diamond Light Source, several Macromolecular Crystallography (MX) beamlines focus on, or include, completely automated data collection. This is used primarily for high throughput collection, which has historically meant several hundred samples per day. Diamond is building its next generation, service-based, data acquisition platform Athena using NSLS-II’s Bluesky experiment orchestration...
SIRIUS is a 4th generation synchrotron light source facility that was designed, built and is operated by the Brazilian Synchrotron Light Laboratory (LNLS/CNPEM). Currently, SIRIUS has 6 fully operational beamlines and another 8 beamlines in technical commissioning, scientific commissioning or installation phases.
The distributed control system is based on EPICS and the software solutions...
Karabo is the control and data processing framework operating the instruments and photon beamlines at the European XFEL. Its event-driven nature is enabled by a central message broker that distributes control information to subscribed software processes.
Originally, Karabo was developed using the Java Messaging System (JMS) broker, and the OpenMQc library to interface it from C++ and Python....
BEC (Beamline Experiments Control) is a new Python-based experiment control software currently developed within the Paul Scherrer Institute (PSI). It will be available to Swiss Light Source (SLS) users starting from January 2025, after the SLS 2.0 upgrade program. BEC provides services dealing with every aspect of a modern beamline control software.
Blissdata is a Python library developed at...
The Advanced Photon Source (APS) at Argonne National Laboratory is at the forefront of facilitating groundbreaking scientific research by providing state-of-the-art X-ray capabilities. Recognizing the critical role of software in maximizing the scientific output of user facilities, APS has embarked on a strategic deployment of Bluesky, a comprehensive software framework designed for data...
The learning curve for beamline control systems is often steep due to the use of command-line controls or various custom-made GUIs. Access to beamlines is limited and time-constrained, so learning command-line controls or scripts takes up valuable time that could be used for the experiment.
This talk focuses on the creation of a web browser application that acts as a beamline...
Over a decade ago, Sardana [1] integrated generic continuous scans [2], initially meeting only basic requirements. More complex scenarios were implemented by migrating generic logic into plugins, often relying on hooks or ad-hoc solutions. In the past year, SOLARIS Synchrotron and the Sardana Community co-hosted a collaborative continuous scans workshop [3] with participation from similar...
HTTomo [1] stands for High Throughput Tomography, a pipeline for the processing and reconstruction of parallel-beam tomography data. The HTTomo project was initiated in 2022 at Diamond Light Source in anticipation of the major data increase with the Diamond-II upgrade. With the support of modern developments in the field of High Performance Computing and multi-GPU processing, the main goal is to...
Since the user program of the China Spallation Neutron Source (CSNS) opened in 2018, eight neutron beamlines have become operational, with the number of users reaching approximately 6000. The data portal of CSNS provides services for data access, data reduction, data analysis and simulation for over 130,000 experimental runs. It is continuously evolving to meet the requirements of users for...
4th generation synchrotron sources provide two orders of magnitude more coherent photons, and thus the ability to collect coherent X-ray imaging datasets faster and/or with a higher resolution. Consequently, the increased volume of data requires dedicated tools to fully take advantage of the improved coherent flux.
PyNX[1,2,3] is developed at ESRF - it has been written from the ground up...
The McStas [1-3] neutron Monte Carlo ray-tracing simulation project was started at Risø in 1997 and has now served the neutron scattering community for more than 25 years.
The presentation will give a brief overview of highlights from the 25-year history of McStas and further update the NOBUGS community on recent developments and future plans for both McStas and its X-ray counterpart...
Data from virtual experiments are becoming a valuable asset for research infrastructures: to develop and optimize current and future instruments; to train users in the instrument control system; and to study, quantify and reduce instrumental effects on acquired data. Furthermore, large sets of simulated data are also a necessary ingredient for the development of surrogate models...
During beamtimes, critical decisions on how to proceed with an experiment must be made constantly. As a result, it is important to provide feedback with the best possible data analysis, mostly in the form of visualizations, with the lowest possible latency. For low data rates, writing and monitoring a file works well. However, processing tens of gigabytes per second is difficult with a...
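Low-latency feedback at high data rates typically means reducing the stream incrementally instead of round-tripping through files. A generator-based sketch of this pattern follows; the frame source is synthetic and stands in for a detector stream:

```python
import numpy as np

def frames(n, shape=(4, 4), seed=0):
    """Synthetic stand-in for a detector delivering frames one by one."""
    rng = np.random.default_rng(seed)
    for _ in range(n):
        yield rng.integers(0, 100, size=shape)

def running_mean(stream):
    """Incremental reduction: update the running mean frame as data
    arrives, rather than writing everything to a file and re-reading
    it for analysis. Yields the latest estimate after every frame."""
    total, count = None, 0
    for frame in stream:
        total = frame.astype(float) if total is None else total + frame
        count += 1
        yield total / count

last = None
for mean in running_mean(frames(10)):
    last = mean        # in practice, pushed to a live visualization
```

The final estimate equals the mean of all frames, but intermediate estimates are available after each frame, which is what enables low-latency monitoring.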
The Canadian Macromolecular Crystallography Facility (CMCF) consists of two beamlines (CMCF-ID and CMCF-BM). The beamlines are operated through a modern computer software system for on-site and remote collection. It consists of a user-friendly graphical user interface for experiment-focused data collection (MxDC), a laboratory information management system for remote planning, experiment...
The Data And Metadata iNspection Interactive Thing (DAMNIT) is a tool developed by the Data Analysis group at the European XFEL (EuXFEL) to help scientists and users effortlessly create overviews of their experiments.
Traditionally, at EuXFEL, many user groups and beamline scientists use spreadsheets and electronic logbooks to track experimental settings, metadata, and analysis results....
Authors: Piero Gasparotto, Luis Barba, Hans-Christian Stadler, Greta Assmann, Henrique Mendonça, Alun W. Ashton, Markus Janousch, Filip Leonarski and Benjamín Béjar
TORO (TOrch-powered Robust Optimization) [1] is a new algorithm for indexing diffraction patterns, applicable when the unit cell geometry is known. Originally based on the PyTorch framework, a dedicated version in CUDA has...
Providing users with remote and random access to structured data is emerging as an important challenge for user facilities in the next decade. Our peers in industry and in other scientific areas are building such services. Tiled is a solution tuned to the requirements of user facilities, applying web standards and widely-adopted technologies in numerical computing to address search, random...
In today’s research facility landscape, experimental data management, metadata cataloguing and access play a vital role in enabling the full research lifecycle, allowing user communities and scientific institutions to collaborate and to transfer and share data on a well-defined collaborative platform.
Following the community best practices on delivering and exporting data, SESAME is...
The SOLEIL Information System carries a 20-year legacy of non-uniform, siloed IT solutions that have continuously evolved in response to changing business requirements, increasing its complexity. A redesign of our information system architecture was deemed necessary to address this challenge, requiring a new, homogeneous, and flexible approach.
Currently, we are in...
In recent years, China's advanced light sources have entered a period of rapid construction and development. As modern X-ray detectors and data acquisition technologies advance, these facilities are expected to generate massive volumes of data annually, presenting significant challenges in data management and utilization. These challenges encompass data storage, metadata handling, data...
SciCat is an open-source data catalog providing data management, annotation, and publishing features for scientific facilities (https://scicatproject.github.io/). It enables tracking of data provenance, annotation with metadata, and publication of datasets with a unique DOI. SciCat is built on a flexible microservice architecture, allowing easy configuration for diverse use cases. The adoption...
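A flavor of what such a catalog ingests is a structured dataset document like the one below. The field names follow the common SciCat schema, but this subset and all of the values are illustrative.

```python
import json

# Simplified SciCat-style raw-dataset document (values are invented).
dataset = {
    "datasetName": "sample_A_tomo_001",
    "type": "raw",
    "owner": "Jane Doe",
    "contactEmail": "jane.doe@example.org",
    "creationTime": "2024-05-01T12:00:00Z",
    "sourceFolder": "/data/beamline/2024/sample_A",
    "ownerGroup": "proposal-12345",
    "scientificMetadata": {"energy_keV": 12.4, "sample": "A"},
}

# What would be POSTed to the catalog's REST API.
document = json.dumps(dataset, indent=2)
```

The free-form `scientificMetadata` block is where per-technique annotations live, while the fixed fields drive search, access control, and DOI publication.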
In the ever-expanding landscape of data management, navigating the diverse array of metadata catalogs such as SciCat, data publications on Invenio derivatives, and internal archives presents a formidable challenge. However, with the right strategies, this mosaic of data can be effectively combined and represented to unlock its full potential. In this talk, we delve into the intricacies of data...
ALBA Synchrotron [1] is actively implementing FAIR data management principles [2] across all operational beamlines. Data is cataloged in ICAT [3], preferably using the NeXus data format [4], alongside metadata sourced from various information systems.
To ensure all metadata is accessible for data interpretation and reuse, gathering beamline and experimental conditions during data collection...
ESS was born with open and reusable data in mind. Based on lessons learned from other research infrastructures, the data pipeline for experiments at the European Spallation Source ERIC (ESS) was outlined from the very beginning and designed to allow for FAIR data and real-time data processing and analysis. In this presentation we will describe the integrated data pipeline at the ESS,...
The Dresden laser acceleration source (DRACO) is a state-of-the-art high-power ultra-short pulse laser system [1,2] that uses an Amplitude Technologies Pulsar architecture to form main and diagnostics beams at different focal lengths and target density conditions. The setup can deliver from 6 J to 45 J of pulse energy at a typical pulse duration of 30 fs and a typical repetition rate of 1 Hz. During the...
Using web applications in a Software-as-a-Service approach is becoming an increasingly important route for science facilities to provide tools to their users.
Diamond has developed two such applications: the XAS data repository and web-CONEXS.
The XAS data repository is a database of XAS data collected on standard or well characterized compounds, facilitating the storage and retrieval of...
The well-known application Slack is primarily used for instant messaging and sharing memes. However, according to the people who make it, Slack is "... a messaging app for business that connects people to the information that they need".
For instrument staff at a scientific facility that information might be the current state of hardware; statuses of various data acquisition services;...
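Slack's incoming webhooks accept a JSON payload with a `text` field, so pushing instrument status into a channel reduces to building that payload. The sketch below only constructs the message; the status names and values are placeholders, and a real deployment would POST the body to a webhook URL (or use `slack_sdk`).

```python
import json

def status_message(statuses):
    """Format a dict of {service: state} as a Slack message payload."""
    lines = [f"{name}: {state}" for name, state in sorted(statuses.items())]
    return {"text": "Instrument status\n" + "\n".join(lines)}

# Example states for hypothetical services.
payload = status_message({"detector": "ACQUIRING", "chopper": "OK"})
body = json.dumps(payload)  # the HTTP POST body a webhook would receive
```

Posting on state *changes* rather than on a timer keeps the channel useful as a glanceable log instead of a firehose.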
Building resilient data streams for large-scale experiments is a critical problem in modern settings in which advanced computing techniques are more tightly integrated with data collection activities. Resilience-aware application solutions must include 1) policy management, in which science-level goals are presented to the system; 2) data movement telemetry, which captures system responses;...
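Two of the ingredients named above, telemetry capture and failure response, can be sketched in a few lines: a transfer loop that records each system response and retries under a simple policy. The function and event names are illustrative, not any particular framework's API.

```python
def resilient_transfer(send, chunks, max_retries=3):
    """Send chunks, retrying on failure; return data-movement telemetry."""
    telemetry = []
    for chunk in chunks:
        for attempt in range(max_retries):
            try:
                send(chunk)
                telemetry.append(("ok", chunk, attempt))
                break
            except ConnectionError:
                telemetry.append(("retry", chunk, attempt))
        else:
            # Policy decision point: all retries exhausted.
            telemetry.append(("failed", chunk, max_retries))
    return telemetry

# Simulated link that drops chunk 1 exactly once.
failures = {1: 1}
def send(chunk):
    if failures.get(chunk, 0) > 0:
        failures[chunk] -= 1
        raise ConnectionError

log = resilient_transfer(send, [0, 1, 2])
```

In a full system the telemetry stream feeds back into the policy layer, so retry counts, routes, or buffering can be adjusted against the stated science-level goals.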
With [FAIR principles][1] increasing in importance within many fields, the challenge of ensuring that these principles are fully embedded in research outputs applies not just to the (meta)data itself, but also to the methods used to process and analyse it. If metadata associated with the raw data is lost in the analysis process, the Findability (which relies on this metadata) may be...
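One way to avoid losing that metadata is to make every analysis step return derived metadata alongside derived data, inheriting the raw record and appending provenance. The field names below are illustrative, not a standard.

```python
def analyse(data, metadata, step_name):
    """Process data while carrying (and extending) its metadata."""
    result = [x * 2 for x in data]  # stand-in for a real analysis step
    derived_md = dict(metadata)     # inherit the raw-data metadata
    # Append provenance without mutating the raw record's list.
    derived_md["provenance"] = metadata.get("provenance", []) + [step_name]
    return result, derived_md

raw = [1, 2, 3]
raw_md = {"facility": "EXAMPLE", "run": 42, "provenance": []}
out, out_md = analyse(raw, raw_md, "double")
```

Because the derived metadata still carries the identifiers of the raw data, the processed output remains findable and its lineage reconstructable.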
In the rapidly evolving landscape of web development, the performance of backend technologies is a critical factor influencing scalability, efficiency, and user experience. This research aims to present a comprehensive performance comparison of Node.js, Rust, Go, and Python — four prominent technologies widely adopted in web application development. Through a series of systematic tests, we...
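The measurement core of such a comparison is a throughput harness: drive a handler N times, time it, and report requests per second. The sketch below is illustrative; in the actual study the handler would be a real HTTP endpoint served by each of the four stacks.

```python
import time

def throughput(handler, n=10_000):
    """Return requests/second for a synchronous handler."""
    start = time.perf_counter()
    for i in range(n):
        handler(i)
    return n / (time.perf_counter() - start)

# Placeholder workload standing in for request handling.
rps = throughput(lambda i: str(i).encode())
```

For cross-language fairness the timed region must exclude warm-up, and identical payloads and concurrency levels must be used for every backend.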
Software engineers, including those involved in scientific software, often mention that they follow best practices. While this sounds like an excellent idea, it is often nearly impossible: frequently, opinions differ on what the best practice actually is. Some other things, like software licenses and naming conventions, are mostly left to the development team to decide. Our software team has developed...