Using published data for the development of processing and analysis tools: XAS data processing use case

In this talk we will discuss the relevance of published data to the development of new tools that can speed up processing and analysis. We will also explain the importance of metadata and how it was used in developing the proposed tools.

The basic idea is the development of workflows and workflow components that can be reused, reducing the time researchers spend on manual tasks. We will show how porting the tools to the Galaxy workflow management system opens possibilities for exploratory analyses.

We will also show how these tools generate the metadata needed to create FAIR digital objects, which can be published as supporting information for the results obtained. This approach requires minimal intervention from the researcher performing the processing and analysis tasks. Consequently, these methods are well suited to improving data publishing practices, facilitating the reproducibility of results, and supporting greater reuse of published data.
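To illustrate the kind of record such a tool can emit automatically alongside processed results, the sketch below assembles a minimal JSON metadata record for a processed XAS dataset. The field names, identifiers, and values are hypothetical examples, not the actual schema used by the UKCH tools.

```python
import json
from datetime import datetime, timezone

def build_fair_metadata(dataset_id, technique, edge, beamline, derived_from):
    """Assemble a minimal metadata record for a processed dataset.

    All field names and values here are illustrative placeholders,
    not the schema of any specific FAIR digital object standard.
    """
    return {
        "@id": dataset_id,
        "type": "Dataset",
        "technique": technique,          # e.g. "XAS"
        "absorptionEdge": edge,          # e.g. "Fe K-edge"
        "instrument": beamline,          # hypothetical beamline name
        "wasDerivedFrom": derived_from,  # provenance link to the raw data
        "dateProcessed": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical example: record the provenance of one processed scan.
record = build_fair_metadata(
    "xas-processed-0001",
    "XAS",
    "Fe K-edge",
    "example-beamline",
    "raw/xas-scan-0001",
)
metadata_json = json.dumps(record, indent=2)
```

Because the record is generated at processing time, the researcher does not have to fill it in by hand; it can be attached to the output and published with the dataset.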

Researchers can already practice with actual data using the resources presented. Additionally, we invite the community to suggest improvements to the tools and ideas for further development.

Click below to watch a recording of the presentation:

Further information at

Contact Abraham Nieva de la Hidalga



Dr. Abraham Nieva de la Hidalga (Cardiff University) joined the UK Catalysis Hub in 2019 as a research associate in data management and software development. He has been working on the development of the UKCH Catalysis Data Infrastructure and Catalysis Research Workbench; these platforms aim to foster the publishing of data and the creation of new processing and analysis tools. Additionally, since 2021, he has been collaborating on the development of the Physical Sciences Data Infrastructure (PSDI), a project of the Scientific Computing Department of the STFC.
